Fluorescence imaging in a light-deficient environment
Patent abstract:
The present invention relates to an endoscopic imaging system for use in a light-deficient environment that includes an imaging device having a tube, one or more image sensors, and a lens assembly that includes at least one optical element corresponding to the one or more image sensors. The endoscopic system includes a monitor for a user to view a scene and an image signal processing controller. The endoscopic system includes an illumination mechanism that has an illumination source that generates one or more pulses of electromagnetic radiation and a lumen that transmits the one or more pulses of electromagnetic radiation to a distal tip of an endoscope.
Publication number: BR112020012744A2
Application number: R112020012744-2
Filing date: 2018-12-27
Publication date: 2020-12-01
Inventors: Joshua D. Talbert; Donald M. Wichern
Applicant: Ethicon Llc
Primary IPC class:
Patent description:
[001] Technological advances have enabled advances in imaging capabilities for medical use. An endoscope can be used to view the inside of a body and examine the inside of an organ or body cavity. Endoscopes can be used to investigate a patient's symptoms, confirm a diagnosis, or apply medical treatment. A medical endoscope can be used to view various systems and body parts including, for example, the gastrointestinal tract, the respiratory tract, the urinary tract, the abdominal cavity through a small incision, and so on. Endoscopes can additionally be used for surgical procedures such as plastic surgery procedures, procedures performed on joints or bones, procedures performed on the neurological system, procedures performed within the abdominal cavity, and so on.
[002] Endoscopes have also been used in non-medical fields to visualize and inspect spaces that may be inaccessible or difficult to see. For example, endoscopes can be used by planners or architects to visualize scale models of proposed buildings or cities. Endoscopes can be used to visualize the inner space of a complex system such as a computer. Endoscopes can even be used by law enforcement or military officials to perform surveillance in tight spaces or to examine explosive devices.
[003] Among their many uses, endoscopes can be beneficial for visualizing a space in color. A digital color image includes at least three layers, or "color channels," for each pixel in the image. Each of the color channels measures the intensity and chrominance of light for a spectral band. Commonly, a digital color image includes a color channel for the red, green, and blue spectral bands of light (this may be referred to as an RGB image). The red, green, and blue color channels each include brightness information for the red, green, or blue spectral band of light. The brightness information from the separate red, green, and blue layers can be combined to create the digital color image. Because a color image is composed of separate layers, a digital camera image sensor commonly includes a color filter array that allows visible red, green, and blue wavelengths of light to reach selected pixel sensors. Each individual pixel sensor element is sensitized to red, green, or blue wavelengths and will only return image data for that wavelength. The image data from the full array of pixel sensors is combined to generate the RGB image.
[004] In the case of endoscopic imaging for medical diagnoses or medical procedures, it may be beneficial or even necessary to visualize a body cavity with color images. For example, if an endoscope is used to view the abdominal cavity of a body, a color image can provide valuable information to help identify different organs or tissues within the abdomen or to identify certain conditions or diseases within the space. As discussed above, a digital camera capable of capturing color images has at least three distinct types of pixel sensors to individually capture the red, green, and blue layers of the color image. The at least three distinct types of pixel sensors can consume relatively significant physical space (when compared with a color-agnostic pixel array), such that the complete pixel array may not fit in the small distal end of the endoscope that is inserted into the body. Because color digital cameras include at least three distinct types of pixel sensors, the full pixel array (i.e., the image sensor) is commonly located in a grip unit of an endoscope that is held by an
endoscope operator and is not placed inside the body cavity. For such an endoscope, light is transmitted along the length of the endoscope from the grip unit to the distal end of the endoscope that is placed within the body cavity. This endoscope configuration has significant limitations. Endoscopes with this configuration are delicate and can easily be misaligned or damaged when bumped or impacted during regular use. This can significantly degrade the quality of the images generated by the endoscope and require the endoscope to be frequently repaired or replaced.
[005] In some cases, and particularly in the case of medical imaging or medical procedures, it may be beneficial to capture more than a color image alone. Color images reflect what the human eye detects when looking at an environment. However, the human eye is limited to seeing only visible light and cannot detect other wavelengths of the electromagnetic spectrum. At wavelengths of the electromagnetic spectrum other than those of "visible light," additional information about an environment can be obtained. One method of detecting additional information about an environment, beyond what the human eye is capable of detecting, is the use of fluorescent reagents. In the case of imaging for medical purposes, fluorescent reagents can provide a unique view of a body cavity that highlights certain tissues, structures, or conditions that the human eye or a computer program cannot detect in an RGB image.
[006] Fluorescence is the emission of light by a substance that has absorbed light or other electromagnetic radiation. Certain fluorescent materials may "glow" or emit a distinct color that is visible to the human eye when the fluorescent material is subjected to ultraviolet light or other wavelengths of electromagnetic radiation. Certain fluorescent materials will stop glowing almost immediately when the radiation source is stopped.
[007] Fluorescence occurs when an orbital electron of a molecule, atom, or nanostructure is excited by light or other electromagnetic radiation and then relaxes to its ground state by emitting a photon from the excited state. The specific frequencies of electromagnetic radiation that excite the orbital electron, or that are emitted as a photon during relaxation, depend on the particular atom, molecule, or nanostructure. In most cases, the light emitted by the substance has a longer wavelength, and therefore lower energy, than the radiation that was absorbed by the substance. However, when the absorbed electromagnetic radiation is intense, it is possible for one electron to absorb two photons. This two-photon absorption can lead to emission of radiation having a shorter wavelength, and therefore greater energy, than the absorbed radiation. Additionally, the emitted radiation may have the same wavelength as the absorbed radiation.
[008] Fluorescence imaging has several practical applications, including mineralogy, gemology, medicine, spectroscopy for chemical sensors, detection of biological processes or signals, and so on. Fluorescence can be used particularly in biochemistry and medicine as a non-destructive means for tracking or analyzing biological molecules. Biological molecules, including certain tissues or structures, can be tracked by analyzing the fluorescent emission of the biological molecules after they have been excited by a certain wavelength of electromagnetic radiation. However, relatively few cellular components are naturally fluorescent.
In certain implementations, it may be desirable to visualize a certain tissue, structure, chemical process, or biological process that is not intrinsically fluorescent. In such an implementation, a dye or reagent that includes a molecule, protein, or quantum dot having fluorescent properties may be administered to the body. The reagent or dye can then fluoresce after being excited by a certain wavelength of electromagnetic radiation. Different reagents or dyes may include different molecules, proteins, and/or quantum dots that will fluoresce at certain wavelengths of electromagnetic radiation. Thus, it may be necessary to excite the reagent or dye with a specialized band of electromagnetic radiation so that it fluoresces and identifies the desired tissue, structure, or process in the body.
[009] Fluorescence imaging can provide valuable information in the medical field that can be used for diagnostic purposes and/or viewed in real time during a medical procedure. Specialized reagents or dyes can be administered to a body to fluoresce in certain tissues, structures, chemical processes, or biological processes. The fluorescence of the reagent or dye can highlight body structures such as blood vessels, nerves, particular organs, and so on. Additionally, the fluorescence of the reagent or dye can highlight conditions or diseases such as cancer cells, or cells undergoing some biological or chemical process that may be associated with a condition or disease. Fluorescence imaging can be used in real time by a medical professional or computer program to distinguish between, for example, cancer cells and non-cancer cells during a surgical tumor removal. Fluorescence imaging can additionally be used as a non-destructive means to track and visualize, over time, a condition in the body that would not otherwise be visible to the human eye or distinguishable in an RGB image.
[0010] However, fluorescence imaging requires specialized emissions of electromagnetic radiation and may additionally require specialized image sensors capable of reading the wavelength of electromagnetic radiation that is emitted by the fluorescing structure or reagent. Different reagents or dyes may be sensitive to different wavelengths of electromagnetic radiation and may additionally emit different wavelengths of electromagnetic radiation when they fluoresce. Imaging systems may therefore be highly specialized and tailored to a particular reagent or dye, such that the system is configured to emit particular wavelengths of electromagnetic radiation and includes image sensors configured to read particular wavelengths of electromagnetic radiation. Such imaging systems can be useful in very limited applications and may not be capable of fluorescing more than one reagent or structure during a single imaging session. It can be very expensive to need multiple distinct imaging systems that are each configured to fluoresce a particular reagent or dye. Additionally, it may be desirable to administer multiple reagents or dyes that are each configured to fluoresce at a different wavelength of electromagnetic radiation.
[0011] Additionally, it may be desirable to overlay fluorescence imaging over a black-and-white or color image to provide context for a medical professional or computer algorithm. Historically, this would require the use of a camera (or multiple cameras) having several different types of pixel sensors that are each sensitive to different ranges of electromagnetic radiation.
This may include the three separate types of pixel sensors for generating a color RGB image by conventional methods, in combination with additional pixel sensors for generating fluorescence image data at different wavelengths of the electromagnetic spectrum. This can consume relatively large physical space and requires a large pixel array to ensure that the image resolution is satisfactory. In the case of endoscopic imaging, the camera or cameras would be placed in a hand unit or robotic unit of the endoscope, because the multiple wavelength-sensitive pixel sensors require too much physical space and too large a pixel array to be placed at the distal end of the endoscope within the body cavity. This introduces the same disadvantages mentioned above and can cause the endoscope to be very delicate, such that image quality is significantly degraded when the endoscope is bumped or impacted during use.
[0012] This description relates generally to electromagnetic sensing and sensors that may be applicable to endoscopic imaging. The description also relates to low-energy electromagnetic input conditions as well as low-energy electromagnetic emission conditions. The description relates more particularly, but not necessarily entirely, to a system for producing an image in light-deficient environments and associated structures, methods, and features, which may include controlling a light source through duration, intensity, or both; pulsing a component-controlled light source during the blanking period of an image sensor; maximizing the blanking period to allow optimal light; and maintaining color balance.
[0013] The features and advantages of the description will be set forth in the description that follows and, in part, will become evident from the description, or may be learned by practicing the description without undue experimentation. The features and advantages of the present description may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0014] Non-limiting and non-exhaustive implementations of the description are described with reference to the following figures, with like reference numbers referring to like parts throughout the various views unless otherwise specified.
The advantages of the description will be better understood with reference to the following description and accompanying drawings, in which:
[0015] Figure 1 is a schematic view of an embodiment of a paired sensor and electromagnetic emitter system in operation for use in producing an image in a light-deficient environment, according to one embodiment;
[0016] Figure 2 is a schematic view of complementary system hardware;
[0017] Figures 2A to 2D are illustrations of the operational cycles of a sensor used to build an image frame, according to embodiments of the description;
[0018] Figure 3 is a graphical representation of the operation of an embodiment of an electromagnetic emitter, according to one embodiment;
[0019] Figure 4 is a graphical representation of varying the duration and magnitude of the emitted electromagnetic pulse to provide exposure control, according to one embodiment;
[0020] Figure 5 is a graphical representation of an embodiment of the description combining the operational cycles of a sensor, the electromagnetic emitter, and the emitted electromagnetic pulses of Figures 2A to 4, demonstrating the imaging system during operation, according to one embodiment;
[0021] Figure 6 illustrates a schematic of two distinct processes over a time period from t(0) to t(1) for recording a video frame for full-spectrum light and partitioned-spectrum light, according to one embodiment;
[0022] Figures 7A to 7E illustrate schematic views of the processes during a time interval for recording a video frame for both full-spectrum and partitioned-spectrum light, in accordance with the principles and teachings of the description;
[0023] Figures 8 to 12 illustrate adjustment of both the electromagnetic emitter and the sensor, where said adjustment may be performed simultaneously in some embodiments, in accordance with the principles and teachings of the description;
[0024] Figures 13 to 21 illustrate sensor correction methods and hardware schematics for use with a partitioned light system, in accordance with embodiments of the description;
[0025] Figures 22 to 23 illustrate a method and hardware schematics for increasing dynamic range within a closed or limited-light environment, according to embodiments of the description;
[0026] Figure 24 illustrates the impact on the signal-to-noise ratio of color correction for a typical Bayer-type sensor compared with the absence of color correction;
[0027] Figure 25 illustrates the chromaticity of three monochromatic lasers compared with the sRGB gamut;
[0028] Figures 26 to 27B illustrate a method and hardware schematics for increasing dynamic range within a closed or limited-light environment, according to embodiments of the description;
[0029] Figures 28A to 28C illustrate the use of an emission of white light that is pulsed and/or synchronized with a corresponding color sensor, in accordance with embodiments of the description;
[0030] Figures 29A and 29B illustrate an implementation having a plurality of pixel arrays for producing a three-dimensional image, in accordance with embodiments of the description;
[0031] Figures 30A and 30B illustrate a perspective view and a side view, respectively, of an implementation of an imaging sensor built on a plurality of substrates, wherein a plurality of pixel columns forming the pixel array are located on a first substrate and a plurality of circuit columns are located on a second substrate, showing an electrical connection and
communication between a column of pixels and its associated or corresponding circuit column;
[0032] Figures 31A and 31B illustrate a perspective view and a side view, respectively, of an implementation of an imaging sensor having a plurality of pixel arrays for producing a three-dimensional image, wherein the plurality of pixel arrays and the image sensor are built on a plurality of substrates;
[0033] Figures 32 to 36 illustrate embodiments of emitters comprising various mechanical filter and shutter configurations, according to embodiments of the description;
[0034] Figure 37 is a schematic diagram illustrating a system for providing illumination to a light-deficient environment, according to one embodiment;
[0035] Figure 38 is a schematic block diagram illustrating a light source having a plurality of emitters, according to one embodiment;
[0036] Figure 39 is a schematic block diagram illustrating a light source having a plurality of emitters, according to another embodiment;
[0037] Figure 40 is a schematic block diagram illustrating a light source having a plurality of emitters, according to yet another embodiment;
[0038] Figure 41 is a schematic diagram illustrating a single optical fiber emitting through a diffuser at an output to illuminate a scene, according to one embodiment;
[0039] Figure 42 is a block diagram illustrating the generation of a filtered image using a filter, according to one embodiment;
[0040] Figure 43 illustrates a portion of the electromagnetic spectrum divided into a plurality of different subspectra that can be emitted by emitters of a light source, according to one embodiment;
[0041] Figure 44 is a schematic diagram illustrating a timing diagram for emission and readout for generating a multispectral or hyperspectral image, according to one embodiment;
[0042] Figure 45 is a block diagram illustrating the generation of a filtered image using a filter, according to one embodiment;
[0043] Figure 46 is a block diagram illustrating the generation of a filtered image using a plurality of filters, according to one embodiment;
[0044] Figure 47 is a schematic diagram illustrating a grid array for object and/or surface tracking, according to one embodiment;
[0045] Figure 48 is a schematic flow diagram illustrating a method for emission and readout for generating a multispectral or hyperspectral image, according to one embodiment; and
[0046] Figure 49 is a schematic flow diagram illustrating a method for emission and readout for generating a fluorescence image, according to one embodiment.
DETAILED DESCRIPTION
[0047] The description extends to computer-based methods, systems, and products for digital imaging that may be primarily suited to medical applications such as endoscopic medical imaging. Such computer-based methods, systems, and products as described in the present invention can provide imaging or diagnostic capabilities for use in robotic medical applications, such as the use of robotics for performing imaging procedures, surgical procedures, and the like. In the following description, reference is made to the accompanying drawings, which form a part hereof, and in which are shown, by way of illustration, specific implementations in which the description may be practiced. It is understood that other implementations may be used and that structural changes may be made without departing from the scope of the description.
[0048] Endoscopes have a wide variety of uses and can provide significant benefits in the medical field. Endoscopy is used in medicine to view the inside of a body and, in some cases, can provide imaging that would otherwise be impossible to see or would require invasive surgical procedures. Endoscopes can be used for medical diagnosis, investigation, or research, and can also be used to perform medical procedures in a minimally invasive manner. Medical endoscopes can provide significant benefits to patients and medical professionals by eliminating the need for painful and invasive corrective or exploratory surgery.
[0049] As described in the present invention, an endoscopic system for use in a light-deficient environment, such as a body cavity, may include an imaging device and an illumination mechanism. The illumination mechanism may include an illumination source for generating pulses of electromagnetic radiation and may additionally include a lumen for transmitting the pulses of electromagnetic radiation to a distal tip of an endoscope. The lumen can transmit pulses of electromagnetic radiation at particular wavelengths or wavelength bands of the electromagnetic spectrum. The lumen can transmit such pulses in a timed sequence, and imaging data can be captured by a sensor during each of the pulses. The imaging data associated with the different wavelengths of the pulses can be used to generate a red, green, and blue (RGB) image and/or fluorescence images. In one embodiment, the fluorescence imaging may be superimposed over a black-and-white or RGB image.
[0050] As described in the present invention, systems, methods, and devices for an endoscopic imaging system can provide specialized image data of a light-deficient environment. The specialized image data can be used to generate fluorescence imaging and/or to identify certain materials, tissues, components, or processes within the light-deficient environment. In certain embodiments, fluorescence imaging can be provided to a medical professional or computer-implemented program to allow identification of certain structures or tissues within a body. Such fluorescence imaging data can be overlaid on black-and-white or RGB images to provide additional information and context.
[0051] Additionally, these systems, methods, and devices for an endoscopic imaging system can be used in coordination with certain reagents or dyes. In a medical imaging implementation, certain reagents or dyes may be administered to a patient, and those reagents or dyes may fluoresce or react to certain wavelengths of electromagnetic radiation. The endoscopic imaging system as described in the present invention can transmit electromagnetic radiation at specified wavelengths to cause the reagents or dyes to fluoresce. The fluorescence of the reagents or dyes can be captured by an image sensor to generate imaging that aids in the identification of tissues or structures and/or aids in diagnosis or research. In one implementation, a patient may be administered a plurality of reagents or dyes that are each configured to fluoresce at different wavelengths and/or provide an indication of different structures, tissues, chemical reactions, biological processes, and so on. In such an implementation, the endoscopic system as described in the present invention can emit each of the applicable wavelengths to cause each of the applicable reagents or dyes to fluoresce. This can eliminate the historical need to perform individual imaging procedures for each of a plurality of reagents or dyes.
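As an illustration of the timed pulse sequence and overlay described in paragraphs [0049] to [0051], the following is a minimal sketch, assuming a hypothetical four-pulse schedule (red, green, blue, and an excitation pulse in the 795-815 nm band); the names, wavelengths, and overlay method are illustrative assumptions, not the implementation of the description:

```python
import numpy as np

# Hypothetical pulse schedule: each entry is (label, wavelength in nm).
# The red, green, and blue pulses build the RGB frame; the 805 nm pulse
# falls in the 795-815 nm excitation band discussed for fluorescence.
PULSE_SCHEDULE = [
    ("red", 630),
    ("green", 532),
    ("blue", 465),
    ("fluorescence_excitation", 805),
]

def capture_frame(wavelength_nm, height=200, width=200):
    """Stand-in for a monochrome sensor readout captured while the
    emitter pulses the given wavelength; returns one intensity plane."""
    return np.zeros((height, width), dtype=np.uint16)

def assemble(frames):
    """Combine the per-pulse data sets: the R, G, and B planes form the
    color image, and the fluorescence plane is overlaid as a highlight."""
    rgb = np.stack(
        [frames["red"], frames["green"], frames["blue"]], axis=-1
    ).astype(np.float32)
    fluo = frames["fluorescence_excitation"].astype(np.float32)
    rgb[..., 1] += fluo  # tint fluorescing regions green in the overlay
    return np.clip(rgb, 0, 65535).astype(np.uint16)

frames = {label: capture_frame(wl) for label, wl in PULSE_SCHEDULE}
image = assemble(frames)  # single RGB frame with the fluorescence overlay
```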
[0052] Medical endoscopes can provide a continuous stream of digital imaging of an internal space of a body in which a distal end of the endoscope is inserted. In many implementations, it may be beneficial or even necessary for the digital image stream to provide full-color imaging so that a medical professional can better distinguish between tissues and structures in the body. In additional implementations, it may be beneficial to provide hyperspectral imaging data to allow more accurate differentiation between structures, tissues, processes, and conditions. Additionally, hyperspectral imaging can allow a medical professional or a computer program to receive information about a condition in a human body that is not visible to the human eye or discernible in a color RGB image.
[0053] Described in the present invention are systems, methods, and devices for generating color image data and/or fluorescence image data with an endoscope. A system of the description includes an imaging device having a tube, one or more image sensors, and a lens assembly. The lens assembly may include at least one optical element corresponding to at least one of the one or more image sensors. The system additionally includes a monitor for viewing a scene and an image signal processing controller. The system may additionally include an illumination mechanism. The illumination mechanism includes an illumination source configured to generate one or more pulses of electromagnetic radiation and a lumen that transmits the one or more pulses of electromagnetic radiation to a distal tip of an endoscope. In one embodiment, at least a portion of the one or more pulses of electromagnetic radiation includes an excitation wavelength of electromagnetic radiation between 795 nm and 815 nm that causes one or more reagents to fluoresce at a wavelength that is different from the excitation wavelength of the portion of the one or more pulses of electromagnetic radiation.
[0054] In one embodiment of the description, an endoscopic system includes an illumination source and pulses electromagnetic radiation at a certain wavelength to excite an electron in a reagent or dye. In one embodiment, the reagent or dye is configured to fluoresce in response to the particular wavelength of electromagnetic radiation that is emitted by the endoscopic system. An image sensor in the endoscopic system can read a fluorescence relaxation emission from the reagent or dye, which may be of lower energy than the electromagnetic radiation pulsed to excite the reagent or dye. The reagent or dye may be specialized to label a certain tissue, structure, biological process, and/or chemical process.
[0055] Imaging reagents, including fluorescent reagents, can improve imaging capabilities in the pharmaceutical, medical, biotechnology, diagnostics, and medical procedure industries. Many imaging techniques, such as X-rays, computed tomography (CT), ultrasound, magnetic resonance imaging (MRI), and nuclear medicine, primarily analyze anatomy and morphology and are unable to detect changes at the molecular level. Fluorescent reagents, dyes, and probes, including quantum dot nanoparticles and fluorescent proteins, can assist medical imaging technologies by providing additional information about certain tissues, structures, chemical processes, and/or biological processes that are present within the imaging region. Imaging using fluorescent reagents may allow cell tracking and/or tracking of certain molecular biomarkers.
Fluorescent reagents can be applied for imaging cancer, infection, inflammation, stem cell biology, and more. Various fluorescent reagents and dyes are being developed and applied to visualize and track biological processes in a non-destructive manner. These fluorescent reagents can be excited by a particular wavelength or band of wavelengths of electromagnetic radiation. Similarly, these fluorescent reagents can emit relaxation energy at a certain wavelength or band of wavelengths when they fluoresce, and the emitted relaxation energy can be read by a sensor to determine the location and/or boundaries of the reagent or dye.
[0056] In one embodiment of the description, an endoscopic system pulses electromagnetic radiation to excite an electron in a fluorescent reagent or dye. The wavelength or band of wavelengths of electromagnetic radiation can be particularly selected to cause a particular reagent or dye to fluoresce. In one embodiment, the endoscopic system can pulse multiple different wavelengths of electromagnetic radiation to cause multiple different reagents or dyes to fluoresce during a single imaging session. A sensor of the endoscopic system can determine a location and/or boundary of a reagent or dye based on the relaxation emission of the reagent or dye. The endoscopic system can additionally pulse electromagnetic radiation in the red, green, and blue bands of visible light. The endoscopic system can determine data for an RGB image and for a fluorescence image according to a pulsing schedule for the pulses of electromagnetic radiation.
[0057] In one embodiment of the description, an endoscopic system includes an illumination source and pulses electromagnetic radiation for spectral or hyperspectral imaging. Spectral imaging uses multiple bands along the electromagnetic spectrum. This is unlike conventional cameras, which capture light only in the three visible-spectrum wavelength bands discernible by the human eye, including the red, green, and blue wavelengths, to generate an RGB image. Spectral imaging can use any wavelength bands of the electromagnetic spectrum, including infrared wavelengths, the visible spectrum, the ultraviolet spectrum, X-ray wavelengths, or any suitable combination of wavelength bands. Spectral imaging can overlay imaging generated from non-visible bands (e.g., infrared) on imaging based on visible bands (e.g., a standard RGB image) to provide additional information that is easily discernible by a person or computer algorithm.
[0058] Hyperspectral imaging is a subcategory of spectral imaging. Hyperspectral imaging combines spectroscopy and digital photography. In a hyperspectral imaging embodiment, a full spectrum or some spectral information is collected at each pixel in an image plane. A hyperspectral camera can use special hardware to capture any suitable number of wavelength bands for each pixel, which can be interpreted as a full spectrum. The purpose of hyperspectral imaging varies for different applications. In one application, the goal of hyperspectral imaging is to obtain the entire electromagnetic spectrum of each pixel in an image scene. This can make it possible to find certain objects that otherwise would not be identifiable under the wavelength bands of visible light. This can allow certain materials or tissues to be accurately identified when those materials or tissues would not be identifiable under the wavelength bands of visible light.
Additionally, this can allow certain processes to be detected by capturing an image across all wavelengths of the electromagnetic spectrum.
[0059] Hyperspectral imaging can provide particular advantages over conventional imaging in medical applications. The information obtained by hyperspectral imaging can allow medical professionals and/or computer-implemented programs to accurately identify certain tissues or conditions, leading to diagnoses that may not be possible, or may be less accurate, if conventional imaging such as RGB imaging is used. Additionally, hyperspectral imaging can be used during medical procedures to enable image-guided surgery, which can allow a medical professional to, for example, view tissues located behind certain tissues or fluids, identify atypical cancerous cells in contrast to typical healthy cells, identify certain tissues or conditions, identify critically important structures, and so on. Hyperspectral imaging can provide specialized diagnostic information about tissue physiology, morphology, and composition that cannot be generated with conventional imaging.
[0060] Hyperspectral endoscopic imaging may have advantages over conventional imaging in various applications and implementations of the description. In medical implementations, hyperspectral endoscopic imaging can allow a medical professional or computer-implemented program to differentiate, for example, nerve tissue, muscle tissue, various vessels, the direction of blood flow, and so on. Hyperspectral imaging can allow atypical cancerous tissue to be precisely differentiated from typical healthy tissue and can therefore allow a medical professional or computer-implemented program to identify the margin of a cancerous tumor during an operation or investigative imaging. Additionally, hyperspectral imaging in a light-deficient environment as described in the present invention can be combined with the use of a reagent or dye to allow further differentiation between certain tissues or substances. In such an embodiment, a reagent or dye can fluoresce in response to a specific wavelength band of the electromagnetic spectrum and thereby provide information specific to the purpose of that reagent or dye. The systems, methods, and devices as described in the present invention can allow any number of wavelength bands to be pulsed so that one or more reagents or dyes can fluoresce at different times. In certain implementations, this can allow multiple clinical conditions to be identified or investigated during a single imaging procedure.
[0061] A medical endoscope can pulse electromagnetic radiation in wavelength bands outside the visible light spectrum to enable hyperspectral imaging. Endoscopic hyperspectral imaging is a non-contact, non-invasive means of medical imaging that does not require a patient to be exposed to the harmful radiation common to other imaging methods.
[0062] Conventional endoscopes, used in, for example, robotic endoscopic procedures such as arthroscopy and laparoscopy, are designed such that the image sensors are typically placed within a grip unit that is held by an endoscope operator and is not inserted into a body cavity. In this configuration, an endoscope unit transmits incident light along the length of an endoscope tube toward the sensor through a complex set of precisely coupled optical components, with minimal loss and distortion.
The cost of the endoscope unit is due mainly to the optical elements, since the optical components are expensive and their manufacturing process is labor intensive. Additionally, this type of endoscope is mechanically delicate, and relatively minor impacts can easily damage the components or upset the relative alignment of those components. Even slight misalignment of the endoscope components (e.g., the precisely coupled optical components) can lead to significant degradation of image quality or render the endoscope unusable. When the components are misaligned, the incident light traveling along the length of the endoscope may diminish so that there is little or no light at the distal end of the endoscope, rendering the endoscope useless. Because conventional endoscopes require these precise and complex optical components, and because those components can easily become misaligned, such conventional endoscopes require frequent, expensive repair cycles to maintain image quality.
[0063] One solution to this problem is to place the image sensor within the endoscope itself, at the distal end. This solution can eliminate the need for the complex and precise collection of coupled optical components that can easily be misaligned and/or damaged. This solution potentially approaches the simplicity, robustness, and optical economy that are universally realized in, for example, cell phone cameras. However, it must be recognized that much of the benefit offered by an endoscope arises from the compact size of the endoscope's distal end. If the distal end of the endoscope were enlarged to accommodate the multiple distinct wavelength-sensitive pixel sensors conventionally used for color or hyperspectral imaging, the pixel array could be too large, and the endoscope might no longer fit into small spaces or might cause obstruction or be invasive when used in a medical implementation. Because the distal end of the endoscope must remain very small, placing one or more image sensors at the distal end is challenging. An acceptable solution to this approach is by no means trivial and introduces its own set of engineering challenges, not least of which is that a sensor for color and/or hyperspectral imaging must fit within a highly confined area. This is particularly challenging when a pixel array in conventional cameras includes separate pixel sensors for each of the red, green, and blue visible light bands, with additional pixel sensors for the other wavelength bands used for hyperspectral imaging. The area of the distal endoscope tip can be particularly confined in the X and Y dimensions, while there is more space along the length of the endoscope tube in the Z dimension.
[0064] Because many of the benefits of an endoscope derive from the small size of its distal end, aggressive constraints must be placed on the image sensor area when the image sensors are located at the distal end. These aggressive constraints on sensor area naturally result in fewer and/or smaller pixels within the pixel array.
Reducing the pixel count can directly affect spatial resolution, while reducing the pixel area can reduce the available signal capacity, and thus the sensitivity of the pixel, as well as lower the signal-to-noise ratio (SNR) of each pixel; the number of pixels and the pixel pitch may be optimized for maximum image quality so that resolution is not an issue. Reducing signal capacity reduces dynamic range, that is, the ability of the imaging device or camera to simultaneously capture all the useful information in scenes with wide ranges of brightness. There are several methods for extending the dynamic range of imaging systems beyond that of the pixel itself. However, all of them can carry some sort of penalty (for example, in resolution or frame rate) and can introduce undesirable artifacts that become problematic in extreme cases. The consequence of reduced sensitivity is that more light energy is required to bring the darkest regions of the scene to acceptable signal levels. Lowering the F-number (increasing the aperture) can compensate for a loss of sensitivity, but at the expense of spatial distortion and reduced depth of focus.
[0065] In the sensor industry, complementary metal-oxide-semiconductor (CMOS) image sensors have largely displaced conventional charge-coupled device (CCD) image sensors in modern camera applications. CMOS image sensors offer greater ease of integration and operation, superior or comparable image quality, greater versatility, and lower cost compared with CCD image sensors. Typically, CMOS image sensors include the circuitry necessary to convert image information into digital data and incorporate varying levels of subsequent digital processing. This can range from basic algorithms aimed at correcting non-idealities, which may, for example, arise from variations in amplifier behavior, to complete image signal processing (ISP) chains providing video data in the standard red-green-blue (RGB) color space, for example (cameras on a chip).
[0066] The control unit of an endoscope or image sensor may be located remotely from the image sensor and may be a significant physical distance from the image sensor. When the control unit is far from the sensor, it may be desirable to transmit the data in the digital domain, since the digital domain is largely immune to interference noise and signal degradation compared with transmitting an analog data stream. It will be recognized that various digital electrical signaling standards may be used, for example LVDS (Low-Voltage Differential Signaling), sub-LVDS, SLVS (Scalable Low-Voltage Signaling), or other digital electrical signaling standards.
[0067] There may be a strong desire to minimize the number of electrical conductors in order to reduce the number of pads that consume space on the sensor, as well as to reduce the complexity and cost of sensor production. While adding analog-to-digital conversion to the sensor may be advantageous, the additional area taken up by the conversion circuitry is offset by the significant reduction in the analog buffering power required owing to the early conversion to a digital signal.
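To make the dynamic-range point in paragraph [0064] concrete, dynamic range is commonly expressed as the ratio of a pixel's full-well capacity to its read-noise floor; the numbers below are illustrative assumptions, not values taken from the description:

\[
\mathrm{DR_{dB}} = 20\log_{10}\!\left(\frac{N_{\text{full well}}}{\sigma_{\text{read}}}\right),
\qquad
20\log_{10}\!\left(\frac{10\,000\ e^-}{2\ e^-}\right) \approx 74\ \text{dB}.
\]

Shrinking the pixel so that the full well falls to 2500 e- at the same read noise gives \(20\log_{10}(1250) \approx 62\) dB, which illustrates how reduced signal capacity narrows the range of scene brightness that can be captured simultaneously.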
[0068] In terms of area consumption, given the typical feature sizes available in CMOS image sensor (CIS) technologies, it may be preferable in some implementations to have all of the internal logic signals generated on the same integrated circuit as the pixel array, through a set of control registers and a simple command interface.
[0069] Some implementations of the description may include aspects of a combined sensor and system design that allow high-definition imaging with low pixel counts in a highly controlled lighting environment. This can be accomplished by pulsing a single-color wavelength frame by frame, and switching or alternating each frame between a single, different color wavelength, using a controlled light source combined with high frame capture rates and a specially designed corresponding monochrome sensor. Additionally, electromagnetic radiation outside the visible light spectrum can be pulsed to allow generation of a hyperspectral image. The pixels can be color agnostic, so that each pixel can generate data for every pulse of electromagnetic radiation, including pulses of the red, green, and blue wavelengths of visible light along with other wavelengths that can be used for hyperspectral imaging.
[0070] As used in the present invention, a monochrome sensor refers to an unfiltered imaging sensor. Since the pixels are color agnostic, the effective spatial resolution is appreciably higher than that of the color (typically Bayer-pattern-filtered) counterparts in conventional single-sensor cameras. A monochrome sensor may also have higher quantum efficiency, since far fewer incident photons are wasted between the individual pixels. Furthermore, Bayer-based spatial color modulation requires that the modulation transfer function (MTF) of the accompanying optical elements be reduced compared with monochrome modulation, in order to blur the color artifacts associated with the Bayer pattern. This has a detrimental impact on the actual spatial resolution that can be realized with color sensors.
[0071] The description also concerns a system solution for endoscopic applications in which the image sensor resides at the distal end of the endoscope. In pursuing a sensor-based system with minimal area, there are other design aspects that can be developed beyond reducing the pixel count. The area of the digital portion of the integrated circuit can be minimized. Furthermore, the number of connections to the integrated circuit (pads) can also be minimized. The description describes new methods for achieving these goals in the realization of such a system. This involves the design of a fully customized CMOS image sensor with a number of new features.
[0072] In order to promote an understanding of the principles in accordance with the description, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe them. However, it will be understood that no limitation of the scope of the description is thereby intended. Any alterations and further modifications of the inventive features illustrated in the present invention, and any additional applications of the principles of the description as illustrated in the present invention, which would normally occur to those skilled in the art in possession of this description, are to be considered within the scope of the description as claimed.
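Paragraph [0068] mentions generating all internal logic signals on the same integrated circuit as the pixel array through a set of control registers and a simple command interface. The sketch below shows what such an interface could look like; the register addresses, value width, and names are invented for illustration and are not specified by the description:

```python
# Hypothetical register map for an on-chip sensor controller.
REG_EXPOSURE = 0x01  # exposure time, in line periods
REG_GAIN     = 0x02  # analog gain code
REG_BLANKING = 0x03  # blanking-period length, in line periods

class CommandInterface:
    """Toy command interface: the host only issues register writes, and
    all timing and logic signals are derived on-chip from these values."""

    def __init__(self):
        self.registers = {}

    def write(self, address: int, value: int) -> None:
        if not 0 <= value <= 0xFFFF:
            raise ValueError("register values are 16 bits")
        self.registers[address] = value

# Configure a long blanking period so the emitter has a wide window in
# which to pulse (see paragraphs [0012] and [00107]).
bus = CommandInterface()
bus.write(REG_EXPOSURE, 400)
bus.write(REG_BLANKING, 120)
```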
[0073] Before the structures, systems, and methods for producing an image in a light-deficient environment are described, it should be understood that the present description is not limited to the particular structures, configurations, process steps, and materials described in the present invention, as such structures, configurations, process steps, and materials may vary somewhat. It should also be understood that the terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting, since the scope of the description will be limited only by the appended claims and equivalents thereof.
[0074] In describing and claiming the subject matter of the description, the following terminology will be used in accordance with the definitions set out below.
[0075] It should be noted that, as used in this specification and the appended claims, the singular forms "a", "an", and "the" include the respective plural forms unless the context clearly dictates otherwise.
[0076] As used in the present invention, the terms "comprising", "including", "containing", "characterized by", and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional, unrecited elements or method steps.
[0077] As used in the present invention, the phrase "consisting of" and grammatical equivalents thereof exclude any element or step not specified in the claim.
[0078] As used in the present invention, the phrase "consisting essentially of" and grammatical equivalents thereof limit the scope of a claim to the specified materials or steps and those that do not materially affect the basic and novel characteristics of the claimed description.
[0079] As used in the present invention, the term "proximal" refers broadly to the concept of a portion closest to an origin.
[0080] As used in the present invention, the term "distal" generally refers to the opposite of proximal, and thus to the concept of a portion farther from an origin, or the farthest portion, depending on the context.
[0081] As used in the present invention, color sensors or multi-spectrum sensors are those sensors known to have a color filter array (CFA) for filtering incoming electromagnetic radiation into its separate components. In the visible range of the electromagnetic spectrum, such a CFA can be built in a Bayer pattern, or a modification thereof, to separate the green, red, and blue spectrum components of light.
[0082] Referring now to Figures 1 to 5, the systems and methods for producing an image in a light-deficient environment will now be described. Figure 1 illustrates a schematic view of a paired sensor and electromagnetic emitter in operation for use in producing an image in a light-deficient environment. This configuration allows increased functionality in a controlled light environment or in a light-deficient environment.
[0083] It should be noted that, as used in the present invention, the term "light" is both a particle and a wavelength and is intended to denote electromagnetic radiation that is detectable by a pixel array and may include wavelengths from the visible and non-visible spectra of electromagnetic radiation.
The term "partition" is used in the present invention to mean a predetermined range of wavelengths of the electromagnetic spectrum that are less than the entire spectrum, or in other words, wavelengths that were some portion of the electromagnetic spectrum. As used in the present invention, an emitter is a source of light that can be controllable with respect to the portion of the electromagnetic spectrum that is emitted or that can operate according to the physics of its components, the intensity of the emissions, or the duration of the emission, or all of the above. An emitter can emit light in any uncertain, diffuse or collimated emission and can be controlled digitally or by analog methods or systems. As used in the present invention, an electromagnetic emitter is a source of a burst of electromagnetic energy and includes light sources such as lasers, LEDs, incandescent light, or any light source that can be digitally controlled. [0084] [0084] A pixel array of an image sensor can be electronically paired with an emitter, so that they are synchronized during operation both for receiving emissions and for adjustments made within the system. As can be seen in Figure 1, an emitter 100 can be set to emit electromagnetic radiation in the form of a laser that can be pulsed to illuminate an object 110. The emitter 100 can pulse at an interval that corresponds to the operation and functionality of a device. pixel array 122. Emitter 100 can pulse light into a plurality of electromagnetic partitions 105, such that the pixel array receives electromagnetic energy and produces a data set that corresponds (in time) to each specific electromagnetic partition 105. For example , Figure 1 illustrates a system having a monochrome sensor 120 having a pixel array (black and white) 122 and supporting circuitry wherein the pixel array 122 is sensitive to electromagnetic radiation of any wavelength. The light emitter 100 illustrated in the figure may be a laser emitter capable of emitting a red electromagnetic partition 105a, a blue electromagnetic partition 105b and a green electromagnetic partition 105c in any desired sequence. In one embodiment where a hyperspectral image can be generated, the light emitter 100 can pulse electromagnetic radiation at any wavelength in the electromagnetic spectrum such that a hyperspectral image can be generated. It will be recognized that other light emitters 100 may be used in Figure 1 without departing from the scope of the description, for example digital or analog based emitters. [0085] [0085] During operation, a specific color or wavelength partition can be assigned to the data created by the monochrome sensor 120 for any individual pulse, with the assignment being based on the time of the pulsed color or wavelength partition coming from of the emitter 100. Although the pixels [0086] [0086] In an exemplary embodiment of the description, the emitter 100 pulses electromagnetic radiation at specialized wavelengths. These pulses may be able to generate a specialized fluorescence image that is particularly suitable for certain medical or diagnostic applications. In the exemplary embodiment, at least a portion of the electromagnetic radiation emitted by the emitter 100 includes an excitation wavelength of electromagnetic radiation between 770 nm and 790 nm and between 795 nm and 815 nm that causes one or more reagents to fluoresce in a wavelength that is different from the excitation wavelength of the portion of electromagnetic radiation. 
[0087] In one embodiment, three data sets representing RED, GREEN, and BLUE electromagnetic pulses can be combined to form a single image frame. One or more additional data sets representing other wavelength partitions can be superimposed on the single image frame that is based on the RED, GREEN, and BLUE pulses. The one or more additional data sets may represent, for example, fluorescence imaging responsive to the excitation wavelengths between 770 nm and 790 nm and between 795 nm and 815 nm. The one or more additional data sets may represent fluorescence and/or hyperspectral imaging that can be superimposed on the single image frame that is based on the RED, GREEN, and BLUE pulses.
[0088] It will be recognized that the description is not limited to any particular color combination or any particular electromagnetic partition, and that any color combination or any electromagnetic partition may be used in place of RED, GREEN, and BLUE, such as cyan, magenta, and yellow; ultraviolet; infrared; any combination of the foregoing; or any other color combination, including all visible and non-visible wavelengths, without departing from the scope of the description. In the figure, the object 110 to be imaged contains a red portion 110a, a green portion 110b, and a blue portion 110c. As illustrated in the figure, the light reflected from the electromagnetic pulses contains only the data for the portion of the object having the specific color that corresponds to the pulsed color partition. These separate color (or color-range) data sets can then be used to reconstruct the image by combining the data sets at 130.
[0089] In one embodiment, a plurality of data sets representing RED, GREEN, and BLUE electromagnetic pulses, along with additional wavelength partitions along the electromagnetic spectrum, can be combined to form a single image frame having an RGB image with hyperspectral image data superimposed on the RGB image. Depending on the application or instance, different combinations of wavelength data sets may be desirable. For example, in some implementations, a data set representing specific wavelength partitions can be used to generate a specialized hyperspectral image for diagnosing a particular clinical condition, investigating certain body tissues, and so on.
[0090] As illustrated in Figure 2, implementations of the present description may comprise or utilize a special-purpose or general-purpose computer, including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Implementations within the scope of this description may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media may be any available media that can be accessed by a general-purpose or special-purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example and not limitation, implementations of the description may comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
[0091] Computer storage media (devices) include RAM, ROM, EEPROM, CD-ROM, solid-state drives ("SSDs") (e.g.
based on RAM), Flash memory, phase-change memory ("PCM"), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code means in the form of computer-executable instructions or data structures and that can be accessed by a general-purpose or special-purpose computer.
[0092] A "network" is defined as one or more data links that allow the transport of electronic data between computer systems and/or modules and/or other electronic devices. In one implementation, a sensor and a camera control unit can be networked in order to communicate with each other, and with other components, over the network to which they are connected. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links that can be used to carry desired program code means in the form of computer-executable instructions or data structures and that can be accessed by a general-purpose or special-purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
[0093] Additionally, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (devices) (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a "NIC") and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system. RAM can also include solid-state drives (SSDs or PCIx-based real-time-memory tiered storage, such as FusionIO). Thus, it should be understood that computer storage media (devices) can be included in computer system components that also (or even primarily) utilize transmission media.
[0094] Computer-executable instructions comprise, for example, instructions and data which, when executed by a processor, cause a general-purpose computer, special-purpose computer, or special-purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are described as example forms of implementing the claims.
[0095] Those skilled in the art will appreciate that the description may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, control units, camera control units, handheld devices, hand pieces, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. It should be noted that any of the above-mentioned computing devices may be provided by or located within a brick-and-mortar location. The description may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by wired data links, wireless data links, or by a combination of wired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.

[0096] Additionally, where appropriate, the functions described in the present invention can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs) can be programmed to carry out one or more of the systems and procedures described in the present invention. Certain terms are used throughout the following description and claims to refer to particular system components. As those skilled in the art will understand, components may be referred to by different names. This document does not intend to distinguish between components that differ in name but not in function.

[0097] Figure 2 is a block diagram illustrating an example computing device 150. The computing device 150 may be used to perform various procedures, such as those discussed in the present invention. The computing device 150 can function as a server, a client, or any other computing entity. The computing device 150 can perform various monitoring functions discussed in the present invention, and can execute one or more application programs, such as the application programs described in the present invention. The computing device 150 can be any of a wide variety of computing devices, such as a desktop computer, a notebook computer, a server computer, a handheld computer, a camera control unit, a tablet computer, and the like.

[0098] The computing device 150 includes one or more processor(s) 152, one or more memory device(s) 154, one or more interface(s) 156, one or more mass storage device(s) 158, one or more input/output (I/O) device(s) 160, and a display device 180, all of which are coupled to a bus 162. The processor(s) 152 include one or more processors or controllers that execute instructions stored in the memory device(s) 154 and/or the mass storage device(s) 158. The processor(s) 152 may also include various types of computer-readable media, such as cache memory.

[0099] The memory device(s) 154 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 164) and/or non-volatile memory (e.g., read-only memory (ROM) 166). The memory device(s) 154 may also include rewritable ROM, such as Flash memory.
[00100] The mass storage device(s) 158 include various computer-readable media, such as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in Figure 2, a particular mass storage device is a hard disk drive 174. Various drives may also be included in the mass storage device(s) 158 to enable reading from and/or writing to the various computer-readable media. The mass storage device(s) 158 include removable media 176 and/or non-removable media.

[00101] The I/O device(s) 160 include various devices that allow data and/or other information to be input to or retrieved from the computing device 150. Examples of I/O devices 160 include digital imaging devices,

[00102] The display device 180 includes any type of device capable of displaying information to one or more users of the computing device 150. Examples of the display device 180 include a monitor, a display terminal, a video projection device, and the like.

[00103] The interface(s) 156 include various interfaces that allow the computing device 150 to interact with other systems, devices, or computing environments. Example interface(s) 156 may include any number of different network interfaces 170, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet. Other interface(s) include a user interface 168 and a peripheral device interface 172. The interface(s) 156 may also include one or more user interface elements 168. The interface(s) 156 may also include one or more peripheral interfaces, such as interfaces for printers, pointing devices (mice, track pads, etc.), keyboards, and the like.

[00104] The bus 162 allows the processor(s) 152, memory device(s) 154, interface(s) 156, mass storage device(s) 158, and I/O device(s) 160 to communicate with one another, as well as with other devices or components coupled to the bus 162. The bus 162 represents one or more of several types of bus structures, such as a system bus, a PCI bus, an IEEE 1394 bus, a USB bus, and so forth.

[00105] For purposes of illustration, programs and other executable program components are shown in the present invention as discrete blocks, although it is understood that such programs and components may reside at various times in different storage components of the computing device 150, and are executed by the processor(s) 152. Alternatively, the systems and procedures described in the present invention can be implemented in hardware, or in a combination of hardware, software, and/or firmware. For example, one or more application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs) can be programmed to carry out one or more of the systems and procedures described in the present invention.

[00106] Figure 2A illustrates the operational cycles of a sensor used in rolling readout mode or during the sensor readout

[00107] Figure 3 graphically illustrates the operation of an embodiment of an electromagnetic emitter. An emitter can be timed to correspond with the cycles of a sensor, so that electromagnetic radiation is emitted within the sensor's operating cycle and/or during a portion of the sensor's operating cycle. Figure 3 illustrates Pulse 1 at 302, Pulse 2 at 304, and Pulse 3 at 306. In one embodiment, the emitter may pulse during the readout portion 202 of the sensor's operating cycle. In one embodiment, the emitter may pulse during the blanking portion 216 of the sensor's operating cycle.
In one embodiment, the emitter may pulse over a period that occurs during portions of two or more sensor operating cycles. In one embodiment, the emitter may begin a pulse during the blanking portion 216, or during the optical black portion 220 of the readout portion 202, and end the pulse during the readout portion 202, or during the optical black portion 218 of the readout portion 202 of the next succeeding cycle. It will be understood that any combination of the above is intended to fall within the scope of the present description, as long as the pulse of the emitter and the cycle of the sensor correspond.

[00108] Figure 4 graphically represents varying the duration and magnitude of the emitted electromagnetic pulse (e.g., Pulse 1 at 402, Pulse 2 at 404, and Pulse 3 at 406) to control exposure. An emitter having a fixed output magnitude may be pulsed during any of the cycles noted above in relation to Figures 2D and 3 for an interval to provide the needed electromagnetic energy to the pixel array. An emitter having a fixed output magnitude may be pulsed over a longer interval of time, thereby providing more electromagnetic energy to the pixels, or the emitter may be pulsed over a shorter interval of time, thereby providing less electromagnetic energy. Whether a longer or shorter interval of time is needed depends upon the operational conditions.

[00109] In contrast to adjusting the interval of time over which the emitter pulses a fixed output magnitude, the magnitude of the emission itself may be increased to provide more electromagnetic energy to the pixels. Similarly, decreasing the magnitude of the pulse provides less electromagnetic energy to the pixels. It should be noted that an embodiment of the system may have the ability to adjust both magnitude and duration concurrently, if desired. Additionally, the sensor may be adjusted to increase its sensitivity and duration as desired for optimal image quality. Figure 4 illustrates varying the magnitude and duration of the pulses. In the illustration, Pulse 1 at 402 has a higher magnitude or intensity than either Pulse 2 at 404 or Pulse 3 at 406. Additionally, Pulse 1 at 402 has a shorter duration than Pulse 2 at 404 or Pulse 3 at 406, so that the electromagnetic energy provided by the pulse is illustrated by the area under the pulse shown in the illustration. In the illustration, Pulse 2 at 404 has a relatively low magnitude or intensity and a longer duration when compared to either Pulse 1 at 402 or Pulse 3 at 406. Finally, in the illustration, Pulse 3 at 406 has an intermediate magnitude or intensity and duration when compared to Pulse 1 at 402 and Pulse 2 at 404.

[00110] Figure 5 is a graphical representation of an embodiment of the description combining the operational cycles, the electromagnetic emitter, and the emitted electromagnetic pulses of Figures 2 to 4 to demonstrate the imaging system during operation in accordance with the principles and teachings of the description. As can be seen in the figure, the electromagnetic emitter pulses the emissions primarily during the blanking period 216 of the sensor, so that the pixels will be charged and ready to read during the readout portion 202 of the sensor cycle. The dashed line portions in the pulse (from Figure 3) illustrate the potential or ability to emit electromagnetic energy during the optical black portions 220 and 218 of the read (sensor) cycle 200 if additional time is needed or desired to pulse electromagnetic energy.
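The exposure control described for Figure 4 can be modeled compactly. The following is a minimal sketch, not the patent's implementation, that treats delivered energy as the area under the pulse (magnitude times duration); all class and function names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Pulse:
    magnitude: float    # emitter output power (arbitrary units)
    duration_us: float  # pulse width in microseconds

    @property
    def energy(self) -> float:
        # Delivered electromagnetic energy ~ area under the pulse.
        return self.magnitude * self.duration_us

def pulse_for_energy(target_energy: float, magnitude: float = None,
                     duration_us: float = None) -> Pulse:
    """Build a pulse delivering target_energy with one parameter held fixed:
    vary duration for a fixed-magnitude emitter, or magnitude for a fixed
    duration."""
    if magnitude is not None:
        return Pulse(magnitude, target_energy / magnitude)
    return Pulse(target_energy / duration_us, duration_us)

# Three pulses of equal delivered energy, echoing Pulses 1-3 of Figure 4:
p1 = Pulse(magnitude=3.0, duration_us=100.0)         # high magnitude, short
p2 = pulse_for_energy(p1.energy, magnitude=1.0)      # low magnitude, long
p3 = pulse_for_energy(p1.energy, duration_us=150.0)  # intermediate
assert abs(p1.energy - p2.energy) < 1e-9 and abs(p1.energy - p3.energy) < 1e-9
```

Under this simple model, holding either parameter fixed and solving for the other reproduces the equal-area pulses of differing shape discussed above.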
[00111] Referring now to Figures 6 to 9A, Figure 6 illustrates a schematic of two distinct processes over a period of time from t(0) to t(1) for recording a frame of video for full spectrum light and partitioned spectrum light. It should be noted that color sensors have a color filter array (CFA) to filter out certain wavelengths of light per pixel, commonly used for full spectrum light reception. An example of a CFA is a Bayer pattern. Because the color sensor may comprise pixels within the array that are made sensitive to a single color within the full spectrum, the result is a reduced-resolution image, since the pixel array has pixel spaces dedicated to only a single color of light within the full spectrum. Typically, such an arrangement is formed in a checkerboard-type pattern across the entire array.

[00112] In contrast, when partitioned spectrums of light are used, a sensor can be made sensitive or responsive to the magnitude of all light energy, because the pixel array will be instructed that it is sensing electromagnetic energy from a predetermined partition of the full spectrum of electromagnetic energy in each cycle. Therefore, to form an image, the sensor need only be cycled through a plurality of differing partitions within the full spectrum of light, and the image can then be reassembled to display a predetermined mixture of color values for every pixel across the array. Accordingly, a higher-resolution image is also provided, because there are reduced distances, as compared to a Bayer sensor, between pixel centers of the same color sensitivity for each of the color pulses. As a result, the formed color image has a higher modulation transfer function (MTF). Because the image from each color partition frame cycle has a higher resolution, the resulting image created when the partitioned light frames are combined into a full color frame also has a higher resolution. In other words, because each and every pixel within the array (instead of, at most, every second pixel in a sensor with a color filter) is sensing the magnitudes of energy for a given pulse and a given scene, just fractions of time apart, a higher-resolution image is created for each scene, requiring less derived (less accurate) data to be introduced.

[00113] For example, white or full-spectrum visible light is a combination of red, green and blue light. In the embodiment shown in Figure 6, it can be seen that in both the partitioned spectrum process 620 and the full spectrum process 610, the time to capture an image runs from t(0) to t(1). In the full spectrum process 610, white light or full-spectrum electromagnetic energy is emitted at 612. At 614, the white or full-spectrum electromagnetic energy is sensed. At 616, the image is processed and displayed. Thus, between time t(0) and t(1), the image has been processed and displayed. Conversely, in the partitioned spectrum process 620, a first partition is emitted at 622 and sensed at 624. At 626, a second partition is emitted and then sensed at 628. At 630, a third partition is emitted and sensed at 632. At 634, the image is processed and displayed. It will be recognized that any system utilizing an image sensor cycle that is at least two times faster than the white light cycle is intended to fall within the scope of the description.

[00114] As can be seen graphically in the embodiment illustrated in Figure 6 between times t(0) and t(1), the sensor for the partitioned spectrum system 620 has cycled three times for every one cycle of the full spectrum system.
In the partitioned spectrum system 620, the first of the three sensor cycles is for a green spectrum 622 and 624, the second of the three is for a red spectrum 626 and 628, and the third is for a blue spectrum 630 and 632. Thus, in an embodiment where the display device (LCD panel) operates at 50-60 frames per second, a partitioned light system should operate at 150-180 frames per second to maintain the continuity and smoothness of the displayed video.

[00115] In other embodiments, there may be different capture and display frame rates. Furthermore, the average capture rate could be any multiple of the display rate.

[00116] In one embodiment, it may be desired that not all partitions be represented equally within the system frame rate. In other words, not all light sources have to be pulsed with the same regularity, so as to emphasize and de-emphasize aspects of the recorded scene as desired by the users. It should also be understood that non-visible and visible partitions of the electromagnetic spectrum may be pulsed together within a system, with their respective data value being stitched into the video output as desired for display to a user.

[00117] An embodiment may comprise a pulse cycle pattern as follows: i. green pulse; ii. red pulse; iii. blue pulse; iv. green pulse; v. red pulse; vi. blue pulse; vii. infrared (IR) pulse; viii. (Repeat)

[00118] As can be seen in the example, an infrared partition or a specialized wavelength partition (e.g., 513 to 545 nm, 565 to 585 nm and/or 900 to 1000 nm) may be pulsed at a rate differing from the rates of the other partition pulses (see the sketch below). This may be done to emphasize a certain aspect of the scene, with the IR data simply being overlaid with the other data in the video output to make the desired emphasis. It should be noted that the addition of an electromagnetic partition on top of the red, green and blue partitions does not necessarily require the serialized system to operate at four times the rate of a full-spectrum non-serial system, because every partition does not have to be represented equally in the pulse pattern. As seen in the embodiment, the addition of a partition pulse that is represented less often in a pulse pattern (infrared in the above example) would result in an increase of less than 20% in the cycling speed of the sensor in order to accommodate the irregular partition sampling.

[00119] In one embodiment, an electromagnetic partition may be emitted that is sensitive to dyes or materials that are used to highlight aspects of a scene. In the embodiment, it may be sufficient to highlight the location of the dyes or materials without the need for high resolution. In such an embodiment, the dye-sensitive electromagnetic partition may be cycled much less frequently than the other partitions in the system to include the emphasized data.

[00120] In various embodiments, the pulse cycle pattern may include any of the following wavelengths in any suitable order. Such wavelengths may be particularly suited to determining multispectral or hyperspectral image data or determining image data based on a relaxation emission of a fluorescent reagent: i. 465 ± 5 nm; ii. 533 ± 4 nm; iii. 638 ± 5 nm; iv. 780 ± 5 nm; v. 805 ± 5 nm; vi. 975 ± 5 nm; vii. 577 ± 2 nm; or viii. 523 ± 4 nm.
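As a concrete illustration of the unequally weighted pattern of [00117] and the sub-20% overhead noted in [00118], the following sketch (illustrative Python, not the patent's firmware) cycles one infrared pulse for every two full RGB cycles:

```python
from itertools import cycle, islice

# Pulse pattern from [00117]: two RGB cycles, then one IR pulse, repeat.
PATTERN = ["green", "red", "blue", "green", "red", "blue", "ir"]

def pulse_schedule(n_pulses: int) -> list:
    """Return the partition emitted ahead of each of the next n_pulses
    sensor readouts."""
    return list(islice(cycle(PATTERN), n_pulses))

print(pulse_schedule(9))
# ['green', 'red', 'blue', 'green', 'red', 'blue', 'ir', 'green', 'red']

# Seven pulses now carry what two plain R-G-B cycles (six pulses) carried,
# so the sensor cycle rate rises by 7/6 - 1, i.e. about 16.7% (< 20%).
overhead = len(PATTERN) / (len(PATTERN) - 1) - 1
print(f"additional sensor cycle speed needed: {overhead:.1%}")
```

The arithmetic in the last two lines reproduces the "less than 20% increase" stated in [00118] for a once-per-pattern infrared pulse.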
[00121] Partition cycles can be split so as to accommodate or approximate various imaging and video standards. In one embodiment, the partition cycles may comprise pulses of electromagnetic energy in the Red, Green and Blue spectrum, as best illustrated in Figures 7A to 7D. In Figure 7A, the different light intensities have been achieved by modulating the light pulse width or duration within the working range shown by the vertical gray dashed lines. In Figure 7B, the different light intensities have been achieved by modulating the light power, or the power of the electromagnetic emitter, which may be a laser or LED emitter, while keeping the pulse width or duration constant. Figure 7C shows the case where both the light power and the light pulse width are being modulated, leading to greater flexibility. The partition cycles may use CMY, IR and ultraviolet, using a non-visible pulse source mixed with visible pulse sources and any other color space required to produce an image or approximate a desired video standard that is currently known or yet to be developed. It should also be understood that a system may be able to switch between the color spaces on the fly to provide the desired image output quality.

[00122] In an embodiment using the Green-Blue-Green-Red color space (as seen in Figure 7D), it may be desirable to pulse the luminance components more often than the chrominance components, because users are generally more sensitive to differences in light magnitude than to differences in light color. This principle can be exploited using a monochromatic sensor, as illustrated in Figure 7D. In Figure 7D, the green color, which contains most of the luminance information, may be pulsed more often or with more intensity in a (G-B-G-R-G-B-G-R...) scheme to obtain the luminance data. Such a configuration would create a video stream that has perceptibly more detail, without creating and transmitting imperceptible data.

[00123] In one embodiment, duplicating the pulse of a weaker partition may be used to produce an output that has been adjusted for the weaker pulse. For example, blue laser light is considered weak relative to the sensitivity of silicon-based pixels and is difficult to produce in comparison to red or green light, and may therefore be pulsed more often during a frame cycle to compensate for the weakness of the light. These additional pulses may be done serially over time or by using multiple lasers that pulse simultaneously to produce the desired compensation effect. It should be noted that, by pulsing during the blanking period (the time during which the sensor is not reading out the pixel array), the sensor is insensitive to differences/mismatches between lasers of the same kind and simply accumulates the light for the desired output. In another embodiment, the maximum light pulse range may differ from frame to frame. This is shown in Figure 7E, where the light pulses are different from frame to frame. The sensor may be built to be able to program different blanking times with a repeating pattern of 2 or 3 or 4 or n frames. In Figure 7E, four different light pulses are illustrated, and Pulse 1 may repeat, for example, after Pulse 4, giving a pattern of four frames with different blanking times. This technique can be used to place the most powerful partition on the smallest blanking time and therefore allow the weakest partition to have a wider pulse in one of the subsequent frames without the need to increase the readout speed. The reconstructed frame may still have a regular pattern from frame to frame, since it is made up of many pulsed frames.

[00124] As can be seen in Figure 8, because each partitioned spectrum of light may have different energy values, the sensor and/or light emitter can be adjusted to compensate for the differences in the energy values.
At 810, data obtained from the histogram of a previous frame may be analyzed. At 820, the sensor may be adjusted as noted below. Additionally, at 830, the emitter may be adjusted. At 840, the image may be obtained from the adjusted sample time of the sensor, or the image may be obtained with adjusted (either increased or decreased) emitted light, or a combination of the above. For example, because the red light spectrum is more readily sensed by a sensor within the system than the blue light spectrum, the sensor can be adjusted to be less sensitive during the red partition cycle and more sensitive during the blue partition cycle, because of the low Quantum Efficiency that the blue partition has in relation to silicon (best illustrated in Figure 9). Similarly, the emitter may be adjusted to provide an adjusted partition (e.g., of higher or lower intensity and duration). Additionally, adjustments may be made at the sensor and emitter levels at the same time. The emitter may also be designed to emit at one specific frequency, or may be changed to emit multiple frequencies of a specific partition to broaden the spectrum of light being emitted, if desired, for a particular application.

[00125] Figure 10 shows a schematic of an unshared 4T pixel. The TX signal is used to transfer accumulated charges from the photodiode (PPD) to the floating diffusion (FD). The reset signal is used to reset the FD to the reset bus. If the reset and TX signals are both "on" at the same time, the PPD is constantly reset (each photocharge generated in the PPD is directly collected at the reset bus) and the PPD is always empty. A usual pixel array implementation includes a horizontal reset line that attaches the reset signals of all pixels within one row and a horizontal TX line that attaches the TX signals of all pixels within one row.

[00126] In one embodiment, the timing of the sensor sensitivity adjustment is illustrated, and the sensor sensitivity adjustment can be achieved using a global reset mechanism (i.e., a means of firing all of the pixel array reset signals at once) and a global TX mechanism (i.e., a means of firing all of the pixel array TX signals at once). This is shown in Figure 11. In this case, the light pulse is constant in duration and amplitude, but the light integrated in all of the pixels starts with the "on"-to-"off" transition of the global TX and ends with the light pulse. Therefore, the modulation is achieved by moving the falling edge of the global TX pulse.

[00127] Conversely, the emitter can emit red light at a lower intensity than blue light to produce a correctly exposed image (best illustrated in Figure 12). At 1210, data obtained from the histogram of a previous frame may be analyzed. At 1220, the emitter may be adjusted. At 1230, the image may be obtained from the adjusted emitted light. Additionally, in one embodiment, both the emitter and the sensor can be adjusted at the same time.

[00128] Reconstructing the partitioned spectrum frames into a full spectrum frame for later output could be as simple as blending the sensed values for each pixel in the array in some embodiments. Additionally, the blending and mixing of values may be simple averages, or may be tuned to a predetermined lookup table of values for desired outputs. In an embodiment of a system using partitioned light spectrums, the sensed values may be post-processed or further refined remotely from the sensor by an image or secondary processor, and just before being output to a display.
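A minimal sketch of such a reconstruction follows, assuming three aligned monochrome frames captured under red, green and blue pulses. The optional lookup table stands in for the predetermined lookup table of values mentioned above; the function name and synthetic data are illustrative.

```python
import numpy as np

def reconstruct_rgb(red_frame, green_frame, blue_frame, lut=None):
    """Stack three monochrome exposures into one RGB frame.

    Each input is a 2-D array from the same monochrome sensor, captured
    during the corresponding color pulse. `lut`, if given, maps raw
    values to desired output values (a predetermined lookup table)."""
    rgb = np.stack([red_frame, green_frame, blue_frame], axis=-1)
    return lut[rgb] if lut is not None else rgb

# 12-bit raw frames from three consecutive pulse cycles (synthetic data).
h, w = 4, 4
r, g, b = (np.random.randint(0, 4096, (h, w)) for _ in range(3))
frame = reconstruct_rgb(r, g, b)
assert frame.shape == (h, w, 3)
```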
[00129] Figure 13 illustrates a basic example 1300 of a monochrome ISP and how an ISP chain can be assembled for the purpose of generating sRGB image sequences from raw sensor data yielded in the presence of the G-R-G-B light pulsing scheme.

[00130] The first stage is concerned with making corrections (see 1302, 1304 and 1306 in Figure 13) to account for any non-idealities in the sensor technology for which it is most appropriate to work in the raw data domain (see Figure 21).

[00131] At the next stage, two frames (see 1308 and 1310 in Figure 13) would be buffered, since each final frame derives data from three raw frames. The frame reconstruction at 1314 would proceed by sampling data from the current frame and the two buffered frames (1308 and/or 1310). The reconstruction process results in full color frames in linear RGB color space.

[00132] In this example, the white balance coefficients at 1318 and the color correction matrix at 1320 are applied before converting to YCbCr space at 1322 for subsequent edge enhancement at 1324. After edge enhancement at 1324, the images are converted back to linear RGB at 1326 for scaling at 1328, if applicable.

[00133] Finally, the gamma transfer function at 1330 would be applied to translate the data to the sRGB domain at 1332.

[00134] Figure 14 is an example of an embodiment of color fusion hardware. The color fusion hardware takes an RGBGRGBGRGBG video data stream at 1402 and converts it to a parallel RGB video data stream at 1405. The bit width on the input side may be, for example, 12 bits per color. The output width for this example would be 36 bits per pixel. Other embodiments may have different initial bit widths and three times that number for the output width. The memory writer block takes the RGBG video stream 1402 as its input and writes each frame to its correct frame memory buffer at 1404 (the memory writer triggers off the same pulse generator 1410 that runs the laser light source). As illustrated at 1404, the memory writing follows the pattern Red, Green 1, Blue, Green 2, and then starts back at Red again. At 1406, the memory reader reads three frames at once to construct an RGB pixel. Each pixel is three times the bit width of an individual color component. The reader also triggers off the laser pulse generator at 1410. The reader waits until the Red, Green 1 and Blue frames have been written, and then proceeds to read them in parallel while the writer continues writing Green 2 and starts back at Red. When Red completes, the reader begins reading from Blue, Green 2 and Red. This pattern continues indefinitely.
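The buffering scheme of [00134] can be sketched in software as follows. This Python class is an illustrative model of the writer/reader pattern, not the actual hardware; buffer names and dimensions are assumptions.

```python
import numpy as np

WRITE_ORDER = ["red", "green1", "blue", "green2"]  # repeating write pattern

class ColorFusion:
    """Model of the color fusion buffers: frames arrive serially in the
    order R, G1, B, G2; once R, G1 and B are present, a parallel RGB frame
    (three times the input bit width per pixel) can be assembled."""

    def __init__(self, shape):
        self.buffers = {name: np.zeros(shape, np.uint16) for name in WRITE_ORDER}
        self.slot = 0

    def write(self, frame):
        # Memory writer: store the incoming frame in its color's buffer.
        self.buffers[WRITE_ORDER[self.slot]] = frame
        self.slot = (self.slot + 1) % len(WRITE_ORDER)

    def read_rgb(self, green_buffer="green1"):
        # Memory reader: read three color planes at once to build RGB pixels.
        return np.stack([self.buffers["red"],
                         self.buffers[green_buffer],
                         self.buffers["blue"]], axis=-1)

fusion = ColorFusion((4, 4))
for _ in WRITE_ORDER:                      # one full R, G1, B, G2 cycle
    fusion.write(np.full((4, 4), 100, np.uint16))
rgb = fusion.read_rgb()                    # three components per pixel
```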
[00135] Referring now to Figures 15 and 16, in one embodiment the RG1BG2RG1BG2 pattern reconstruction illustrated in Figure 16 allows 60 fps output with 120 fps input. Each consecutive frame contains both a red and a blue component from the previous frame. In Figure 16, each color component is available in 8.3 ms, and the resulting reconstructed frame has a period of 16.67 ms. In general, for this pulsing scheme, the reconstructed frame has a period double that of the incoming color frame, as shown in Figure 15. In other embodiments,

[00136] Figures 17 to 20 illustrate color correction methods and hardware schematics for use with a partitioned light system. It is common in digital imaging to manipulate the values within image data to correct the output to meet user expectations or to highlight certain aspects of the imaged object. This is most commonly done with satellite imagery, which is adjusted and tuned to emphasize one type of data over another. Most often, in data gathered by satellite, the full spectrum of electromagnetic energy is available because the light source is not controlled, i.e., the sun is the light source. In contrast, there are imaging conditions where the light is controlled and even provided by a user. In such situations, calibration of the image data is still desirable, because without calibration improper emphasis may be given to certain data over other data. In a system where the light is controlled by the user, it is advantageous to provide light emissions that are known to the user and that may be only a portion of the electromagnetic spectrum, or a plurality of portions of the full electromagnetic spectrum. The calibration remains important to meet user expectations and to check for faults within the system. One method of calibration can be a table of expected values for a given imaging condition against which the sensed data can be compared. One embodiment may include a color-neutral scene having known values that should be output by the imaging device, and the device may be adjusted to meet those known values when the device samples the color-neutral scene.

[00137] In use, and upon startup, the system may sample a color-neutral scene at 1710 (as illustrated in Figure 17) by running a full cycle of a plurality of electromagnetic spectrum partitions at 1702. A table of values 1708 can be formed, producing a histogram for the frame at 1704. The frame values can then be compared with the known or expected values of the color-neutral scene at 1706. The imaging device can then be adjusted to meet the desired output at 1712. In an embodiment illustrated in Figure 17, the system may comprise an image signal processor (ISP) that can be adjusted to color-correct the imaging device.

[00138] It should be noted that, because each partitioned spectrum of light may have different energy values, the sensor and/or light emitter can be adjusted to compensate for the differences in the energy values. For example, in one embodiment, because the blue light spectrum has a lower quantum efficiency than the red light spectrum in relation to silicon-based imagers, the responsiveness of the sensor can be adjusted to be less responsive during the red cycle and more responsive during the blue cycle. Conversely, the emitter may emit blue light at a higher intensity than red light, because of the lower quantum efficiency of blue light, to produce a correctly exposed image.

[00139] In an embodiment illustrated in Figure 18, where the emissions of the light source are provided by and controllable by the system, those light emissions can be adjusted to color-correct an image at 1800. Adjustments may be made to any aspect of the emitted light, e.g., the magnitude, the duration (i.e., the time in the on state), or the range within the spectrum partition. Additionally, both the emitter and the sensor can be adjusted at the same time in some embodiments, as shown in Figure 19.

[00140] To reduce the amount of noise and artifacts within the output image stream or video, fractional adjustments can be made to the sensor or the emitter within the system, as seen in Figure 20.
A system 2000 is illustrated in Figure 20 in which both the emitter 2006 and the sensor 2008 can be adjusted, but an imaging device in which either the emitter or the sensor is adjusted during use, or during part of the use, is also contemplated and is within the scope of the present description. It may be advantageous to adjust only the emitter during one part of the use and adjust only the sensor during another part of the use, while further adjusting both at the same time during yet another part of the use. In any of the above embodiments, improved image quality can be obtained by limiting the overall adjustments that the system can make between frame cycles. In other words, an embodiment may be limited such that the emitter may only adjust a fraction of its operational range at any time between frames. Similarly, the sensor may be limited such that it may only adjust a fraction of its operational range at any time between frames. Furthermore, both the emitter and the sensor may be limited such that, in an embodiment, together they may only adjust a fraction of their respective operational ranges at any time between frames.

[00141] In an exemplary embodiment, a fractional adjustment of the components within the system may be performed, for example, at about 0.1 dB of the operational range of the components to correct the exposure of the previous frame. The 0.1 dB is merely an example, and it should be noted that, in other embodiments, the allowed adjustment of the components may be any portion of their respective operational ranges. The components of the system can be changed by intensity or duration adjustments that are generally governed by the number of bits (resolution) output by the component. The component resolution may typically range between about 10 and 24 bits, but should not be limited to this range, as it is intended to include resolutions for components that are yet to be developed in addition to those that are currently available. For example, if after a first frame it is determined that the scene is too blue when observed, then the emitter may be adjusted to reduce the magnitude or duration of the pulse of the blue light during the blue cycle of the system by a fractional adjustment as discussed above, for example, about 0.1 dB.

[00142] In this exemplary embodiment, more than 10 percent may have been needed, but the system has been limited to adjusting by 0.1 dB of the operational range per system cycle. Accordingly, during the next system cycle, the blue light can be adjusted again, if needed. Fractional adjustment between cycles may have a damping effect on the output images, and will reduce noise and artifacts when the emitters and sensors are operated at their operational extremes. It may be determined that any fractional amount of the operational range of component adjustment may be used as a limiting factor, or it may be determined that certain embodiments of the system may comprise components that can be adjusted over their entire operational range.
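The damped correction loop just described can be modeled in a few lines. The sketch below assumes the stated 0.1 dB-per-cycle limit; the function name and the requested correction value are illustrative.

```python
def fractional_adjust(current_db, requested_db, max_step_db=0.1):
    """Move toward requested_db, clamped to max_step_db per frame cycle,
    so large corrections are spread over several cycles."""
    delta = requested_db - current_db
    step = max(-max_step_db, min(max_step_db, delta))
    return current_db + step

# The scene is found to be far too blue: a reduction of several tenths of
# a dB is requested, but the system converges over multiple frame cycles
# instead of jumping there in one, damping visible noise and artifacts.
level = 0.0
for frame in range(6):
    level = fractional_adjust(level, requested_db=-0.4)
    print(f"frame {frame}: blue emitter at {level:+.1f} dB")
```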
[00143] Additionally, the optical black area of any image sensor can be used to aid in image correction and noise reduction. In one embodiment, the values read from the optical black area may be compared with those of the active pixel region of a sensor to establish a reference point to be used in image data processing. Figure 21 shows the kind of sensor correction processes that might be employed in a color pulsed system. CMOS image sensors typically have multiple non-idealities that have a detrimental effect on image quality, particularly in low light. Chief among these are fixed pattern noise and line noise. Fixed pattern noise is a dispersion in the offsets of the sensing elements. Typically, most of the FPN is pixel-to-pixel dispersion that stems, among other sources, from random variations in dark current from photodiode to photodiode. This appears quite unnatural to the observer. Even more notable is column FPN, which results from offsets in the readout chain associated with particular columns of pixels. This results in perceived vertical stripes within the image.

[00144] Having full control of the illumination has the benefit that entire frames of dark data may be acquired periodically and used to correct for the pixel and column offsets. In the illustrated example, a single frame buffer may be used to make a running average of the whole frame without light using, e.g., simple exponential smoothing. This average dark frame would be subtracted from every illuminated frame during regular operation.

[00145] Line noise is a stochastic temporal variation in the offsets of pixels within each row. Because it is temporal, the correction must be computed anew for each line and each frame. For this purpose, there are usually many optically blind (OB) pixels within each row in the array, which must first be sampled to assess the line offset before the light-sensitive pixels are sampled. The line offset is then simply subtracted during the line noise correction process.

[00146] In the example of Figure 21, there are other corrections concerned with getting the data into the proper order, monitoring and controlling the voltage offset in the analog domain (black clamp), and identifying/correcting individual defective pixels.
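A compact model of the two corrections described above is sketched below. It assumes an illustrative array layout in which the first few columns are the optically blind pixels; the class name and smoothing factor are likewise assumptions, not the patent's implementation.

```python
import numpy as np

class BlackCorrector:
    def __init__(self, shape, alpha=0.1):
        self.dark_avg = np.zeros(shape)  # single frame buffer: running average
        self.alpha = alpha               # exponential smoothing factor

    def accumulate_dark(self, dark_frame):
        """Update the running dark-frame average from an unlit capture
        (simple exponential smoothing, per [00144])."""
        self.dark_avg += self.alpha * (dark_frame - self.dark_avg)

    def correct(self, frame, ob_columns=4):
        """Subtract the averaged dark frame (fixed pattern noise), then
        subtract each row's OB-pixel mean (line noise), which is sampled
        before the light-sensitive pixels ([00145])."""
        out = frame - self.dark_avg
        line_offset = out[:, :ob_columns].mean(axis=1, keepdims=True)
        return out - line_offset

corr = BlackCorrector((8, 8))
corr.accumulate_dark(np.random.normal(10, 1, (8, 8)))   # periodic dark frame
clean = corr.correct(np.random.normal(110, 1, (8, 8)))  # illuminated frame
```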
[00147] Figures 22 and 23 illustrate a method and hardware schematics for increasing the dynamic range within an enclosed or limited light environment. In one embodiment, exposure inputs may be input at different levels over time and combined to produce greater dynamic range. As can be seen in Figure 22, an imaging system may be cycled at a first intensity for a first cycle at 2202, then subsequently cycled at a second intensity for a second cycle at 2204, and these first and second cycles may then be combined into a single frame at 2206, so that greater dynamic range can be achieved. Greater dynamic range may be especially desirable because of the limited space environment in which an imaging device is used. In limited space environments that are light deficient or dark, except for the light provided by the light source, and where the light source is close to the light emitter, exposure has an exponential relationship with distance. For example, objects near the light source and the optical opening of the imaging device tend to be overexposed, while objects farther away tend to be extremely underexposed, because there is very little (if any) ambient light present.

[00148] As can be seen in Figure 23, the cycles of a system having emissions of electromagnetic energy in a plurality of partitions can be cycled serially according to the electromagnetic spectrum partitions at 2300. For example, in an embodiment in which the emitter emits lasers in a distinct red partition, a distinct blue partition, and a distinct green partition,

[00149] Alternatively, the system can be cycled in the form of: i. red at intensity one at 2302; ii. blue at intensity one at 2302; iii. green at intensity one at 2302; iv. red at intensity two at 2304; v. blue at intensity two at 2304; vi. green at intensity two at 2304.

[00150] In this embodiment, a first image may be derived from the intensity one values, and a second image may be derived from the intensity two values, which may then be combined or processed as complete image data sets at 2310 rather than as their component parts.

[00151] It is contemplated as being within the scope of the present description that any number of emission partitions may be used in any order. As seen in Figure 23, "n" is used as a variable to denote any number of electromagnetic partitions, and "m" is used to denote any intensity level for the "n" partitions. The system may be cycled in the form of: i. n at intensity m at 2306; ii. n+1 at intensity m+1,

[00152] Accordingly, any serial cycling pattern can be used to produce the desired image correction, where "i" and "j" are additional values within the operational range of the imaging system.

[00153] Digital color cameras incorporate an image processing stage for the purpose of enhancing the fidelity of color reproduction. This is accomplished by way of a 3 x 3 matrix known as the Color Correction Matrix (CCM):

$$\begin{bmatrix} R \\ G \\ B \end{bmatrix}_{OUTPUT} = \begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}_{INPUT}$$

[00154] The terms in the CCM are tuned using a set of reference colors (e.g., from a Macbeth chart) to provide the best overall match to the sRGB standard color space. The diagonal terms, a, e and i, are effectively white balance gains. Typically, though, white balance is applied separately, and the sums of the horizontal rows are constrained to unity, so that no net gain is applied by the CCM itself. The off-diagonal terms effectively deal with the color crosstalk in the input channels. Therefore, Bayer sensors have larger off-diagonal terms than 3-chip cameras, since the color filter arrays have a great deal of response overlap between the channels.

[00155] There is a signal-to-noise ratio penalty for color correction that depends on the magnitude of the off-diagonal terms. A hypothetical sensor with channels that perfectly matched the sRGB components would have the identity matrix as its CCM:

$$\begin{bmatrix} R \\ G \\ B \end{bmatrix}_{OUTPUT} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}_{INPUT}$$

[00156] The signal-to-noise ratio evaluated in the green channel, for a perfect white photosignal of 10,000 e- per pixel (ignoring read noise), for this case would be:

$$SNR = \frac{10{,}000}{\sqrt{10{,}000}} = 100 \;(= 40\ \mathrm{dB})$$

[00157] Any deviation from this degrades the SNR. Take, for example, the following CCM, which has values that would not be atypical for a Bayer CMOS sensor:

$$\begin{bmatrix} R \\ G \\ B \end{bmatrix}_{OUTPUT} = \begin{bmatrix} 2.6 & -1.4 & -0.2 \\ -0.3 & 1.6 & -0.3 \\ 0 & -0.6 & 1.6 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}_{INPUT}$$

[00158] In this case, the green SNR:

$$SNR = \frac{-3000 + 16{,}000 - 3000}{\sqrt{3000 + 16{,}000 + 3000}} \approx 67 \;(\approx 36.6\ \mathrm{dB})$$

[00159] Figure 24 shows the result of a full SNR simulation using D65 illumination for a typical Bayer sensor CCM, for the identity-matrix versus the tuned-CCM use cases. The SNR evaluated for the luminance component is about 6 dB worse as a consequence of the color correction.

[00160] The system described in the present description uses monochromatic illumination at a plurality of discrete wavelengths, so there is no color crosstalk per se. The crosses in Figure 25 indicate the positions of three wavelengths that are available via laser diode sources (465, 532 and 639 nm), compared with the sRGB gamut, which is indicated by the triangle.

[00161] The off-diagonal terms for the CCM are, in this case, drastically reduced compared with Bayer sensors, which provides a significant SNR advantage.
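The green-channel SNR figures above can be reproduced with the simplified shot-noise model used in [00156] to [00158], in which the corrected signal is the signed sum of the row terms while the noise grows with the sum of their magnitudes. The following sketch is illustrative code, not part of the described system:

```python
import numpy as np

def green_snr(ccm_green_row, signal_e=10_000):
    """Green-channel SNR for a perfect white signal of signal_e electrons
    per channel, read noise ignored."""
    row = np.asarray(ccm_green_row, dtype=float)
    out_signal = (row * signal_e).sum()              # -3000 + 16000 - 3000
    noise = np.sqrt((np.abs(row) * signal_e).sum())  # sqrt(3000+16000+3000)
    return out_signal / noise

identity_snr = green_snr([0.0, 1.0, 0.0])   # 100:1, i.e. 40 dB
bayer_snr = green_snr([-0.3, 1.6, -0.3])    # ~67:1, i.e. ~36.6 dB
print(20 * np.log10(identity_snr), 20 * np.log10(bayer_snr))
```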
[00162] Figure 26 illustrates an imaging system having increased dynamic range as provided by the pixel configuration of the pixel array of the image sensor. As can be seen in the figure, adjacent pixels 2602 and 2604 can be set at different sensitivities, so that each cycle includes data produced by pixels that are more and less sensitive relative to each other. Because a plurality of sensitivities can be recorded in a single cycle of the array, the dynamic range can be increased with recording in parallel, unlike the time-dependent serial nature of other embodiments.

[00163] In one embodiment, an array may comprise rows of pixels that may be placed in rows based on their sensitivities. In one embodiment, pixels of different sensitivities may alternate within a row or column with respect to their nearest neighboring pixels, so as to form a checkerboard-type pattern across the entire array based on those sensitivities. The above may be accomplished by any shared pixel circuit arrangement or any independent pixel circuit arrangement.

[00164] Wide dynamic range can be achieved by having multiple global TX signals, with each TX firing only on a different set of pixels. For example, in global mode, global TX signal 1 fires pixel set 1; global TX signal 2 fires pixel set 2; ...; global TX signal n fires pixel set n.

[00165] Based on Figure 11, Figure 27A shows a timing example for two different pixel sensitivities (dual pixel sensitivity) in the pixel array. In this case, the global TX 1 signal fires half of the pixels of the array and global TX 2 fires the other half of the pixels. Because global TX 1 and global TX 2 have different "on"-to-"off" edge positions, the integrated light is different between the TX1 pixels and the TX2 pixels. Figure 27B shows a different embodiment of the timing for dual pixel sensitivity. In this case, the light pulse is modulated twice (pulse duration and/or amplitude). The TX1 pixels integrate the P1 pulse and the TX2 pixels integrate the P1 + P2 pulses. Separating the global TX signals can be done in several ways; examples include: i. differentiating the TX lines of each row; and ii. sending multiple TX lines per row, each addressing a different set of pixels.
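The following sketch models the checkerboard pixel sets and per-set integration times under a simple linear, noiseless sensor model; the set assignment and exposure values are illustrative assumptions, not the described circuit.

```python
import numpy as np

def checkerboard_sets(h, w):
    """Assign pixels to set 0 (TX1) or set 1 (TX2) in a uniform
    checkerboard pattern across the array ([00163])."""
    rows, cols = np.indices((h, w))
    return (rows + cols) % 2

def integrate(scene, sets, t_long, t_short):
    """Simulate one frame: each pixel integrates light for its own set's
    integration time, closed by that set's global TX edge ([00165])."""
    exposure = np.where(sets == 0, t_long, t_short)
    return scene * exposure  # linear model, no noise

sets = checkerboard_sets(4, 4)
frame = integrate(np.ones((4, 4)), sets, t_long=1.0, t_short=0.25)
# Long- and short-exposure pixels now coexist within the same frame.
```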
[00166] In one implementation, a means of providing wide dynamic range video that exploits the color pulsing system described in the present description is provided. The basis of this is to have multiple flavors of pixels, or pixels that can be adjusted differently, within the same monochromatic array, which are able to integrate incident light for different durations within the same frame. An example of the pixel arrangement in the array of such a sensor would be a uniform checkerboard pattern across the whole array, with two independently variable integration times. For such a case, it is possible to provide both red and blue information within the same frame. In fact, it is possible to do so while also extending the dynamic range for the green frames, where it is most needed,

[00167] An inherent property of the monochromatic wide dynamic range (WDR) array is that the pixels that have the long integration time must integrate a superset of the light seen by the short integration time pixels. For regular wide dynamic range operation in the green frames, that is desirable. For the red and blue frames, it means that the pulsing must be controlled in conjunction with the exposure periods so as to, e.g., provide blue light from the start of the long exposure and switch to red at the point at which the short exposure pixels are turned on (both pixel types have their charges transferred at the same time).

[00168] At the color fusion stage, the two pixel flavors are separated into two buffers. The empty pixels are then filled in using, e.g., linear interpolation. At that point, one buffer contains a complete image of blue data and the other contains red + blue data. The blue buffer may be subtracted from the second buffer to give pure red data.

[00169] Figures 28A to 28C illustrate the use of an emission of white light that is pulsed and/or synchronized with a corresponding color sensor, or held constant. As can be seen in Figure 28A, a white light emitter may be configured to emit a beam of light during the blanking period of a corresponding sensor to provide a controlled light source in a controlled light environment. The light source may emit a beam at a constant magnitude and vary the duration of the pulse, as seen in Figure 28A, or may hold the pulse constant with varying magnitude in order to achieve correctly exposed data, as illustrated in Figure 28B. Illustrated in Figure 28C is a graphical representation of a constant light source that can be modulated with a varying current that is controlled by and synchronized with a sensor.

[00170] In one embodiment, white light or multi-spectrum light may be emitted as a pulse, if desired, to provide data for use within the system (best illustrated in Figures 28A to 28C). White light emissions, in combination with partitions of the electromagnetic spectrum, may be useful for emphasizing and de-emphasizing certain aspects within a scene. Such an embodiment might use a pulse pattern of: i. green pulse; ii. red pulse; iii. blue pulse; iv. green pulse; v. red pulse; vi. blue pulse; vii. white light (multi-spectrum) pulse; viii. (Repeat)

[00171] Any system using an image sensor cycle that is at least two times faster than the white light cycle is intended to fall within the scope of the description. It will be recognized that any combination of partitions of the electromagnetic spectrum is contemplated by the present invention, whether from the visible or the non-visible spectrum of the full electromagnetic spectrum.

[00172] Figures 29A and 29B illustrate a perspective view and a side view, respectively, of an implementation of a monolithic sensor 2900 having a plurality of pixel arrays for producing a three-dimensional image in accordance with the teachings and principles of the description. Such an implementation may be desirable for three-dimensional image capture, where the two pixel arrays 2902 and 2904 may be offset during use. In another implementation, a first pixel array 2902 and a second pixel array 2904 may be dedicated to receiving a predetermined range of wavelengths of electromagnetic radiation, with the first pixel array being dedicated to a different range of wavelengths of electromagnetic radiation than the second pixel array.
[00173] Figures 30A and 30B illustrate a perspective view and a side view, respectively, of an implementation of an imaging sensor 3000 constructed on a plurality of substrates. As illustrated, a plurality of pixel columns 3004 forming the pixel array are located on the first substrate 3002, and a plurality of circuit columns 3008 are located on a second substrate 3006. Also illustrated are the electrical connection and communication between one column of pixels and its associated or corresponding column of circuitry. In one implementation, an image sensor that might otherwise be manufactured with its pixel array and supporting circuitry on a single, monolithic substrate/integrated circuit may have the pixel array separated from all or a majority of the supporting circuitry. The description may use at least two substrates/integrated circuits, which will be stacked together using three-dimensional stacking technology. The first 3002 of the two substrates/integrated circuits may be processed using an image CMOS process. The first substrate/integrated circuit 3002 may be comprised either exclusively of a pixel array or of a pixel array surrounded by limited circuitry. The second or subsequent substrate/integrated circuit 3006 may be processed using any process, and does not have to be an image CMOS process. The second substrate/integrated circuit 3006 may be, but is not limited to, a highly dense digital process in order to integrate a variety and number of functions in a very limited space or area on the substrate/integrated circuit, or a mixed-mode or analog process in order to integrate, for example, precise analog functions, or an RF process in order to implement wireless capability, or MEMS (Micro-Electro-Mechanical Systems) in order to integrate MEMS devices. The image CMOS substrate/integrated circuit 3002 may be stacked with the second or subsequent substrate/integrated circuit 3006 using any three-dimensional technique. The second substrate/integrated circuit 3006 may support most or a majority of the circuitry that would otherwise have been implemented in the first image CMOS integrated circuit 3002 (if implemented on a monolithic substrate/integrated circuit) as peripheral circuits, and therefore increase the overall system area while keeping the pixel array size constant and optimized to the fullest extent possible. The electrical connection between the two substrates/integrated circuits may be made through interconnects 3003 and 3005, which may be wire, bump and/or TSV (Through Silicon Via) connections.

[00174] Figures 31A and 31B illustrate a perspective view and a side view, respectively, of an implementation of an imaging sensor 3100 having a plurality of pixel arrays for producing a three-dimensional image. The three-dimensional image sensor may be constructed on a plurality of substrates and may comprise the plurality of pixel arrays and other associated circuitry, where a plurality of pixel columns 3104a forming the first pixel array and a plurality of pixel columns 3104b forming a second pixel array are located on respective substrates 3102a and 3102b, respectively, and a plurality of circuit columns 3108a and 3108b are located on a separate substrate 3106. Also illustrated are the electrical connections and communications between columns of pixels and the associated or corresponding column of circuitry.

[00175] It will be recognized that the teachings and principles of the description may be used in a reusable device platform, a limited-use device platform, a device platform limited to a specific number of uses, or a single-use/disposable device platform without departing from the scope of the description.
It will be recognized that, in a reusable device platform, an end user is responsible for cleaning and sterilizing the device. In a limited-use device platform, the device can be used some specified number of times before becoming inoperable. A typical new device is delivered sterile, with additional uses requiring the end user to clean and sterilize it before further use. In a device platform limited to a specific number of uses, a third party may reprocess (e.g., clean, package and sterilize) a single-use device for additional uses at a lower cost than a new unit. In a single-use/disposable device platform, a device is provided sterile to the operating room and is used only once before being disposed of.

[00176] One embodiment of an emitter may employ a mechanical shutter and filters to create pulsed color light. An alternative method for producing pulsed color light, using a white light source and a mechanical color filter and shutter system 3200, is illustrated in Figure 32. The wheel could contain a pattern of translucent color filter windows and opaque sections for shuttering. The opaque sections would not allow light to pass and would create a period of darkness in which the sensor readout could occur. The white light source could be based on any technology: laser, LED, xenon, halogen, metal halide, or other. The white light can be projected through a series of color filters 3207, 3209 and 3211 into the desired pattern of colored light pulses. One embodiment of the pattern could be Red filter 3207, Green filter 3209, Blue filter 3211, Green filter 3209. The filters and the shutter system 3200 could be arranged on a wheel that rotates at the required frequency to remain in synchronization with the sensor, so that knowledge of the arc length and rate of rotation of the mechanical color filters 3207, 3209 and 3211 and the shutter system 3205 would provide timing information for the operation of a corresponding monochromatic image sensor.

[00177] As illustrated in Figure 33, one embodiment may comprise a pattern of only translucent color filters 3307, 3309 and 3311 on a filter wheel 3300. In the present configuration, a different shutter may be used. The shutter could be mechanical and could dynamically adjust the duration of the "pulse" by varying its size. Alternatively, the shutter could be electronic and incorporated into the sensor design. A motor rotating the filter wheel 3300 will need to communicate with, or be controlled in conjunction with, the sensor, so that knowledge of the arc length and rate of rotation of the mechanical color filter system 3307, 3309 and 3311 provides timing information for the operation of the corresponding monochromatic image sensor. The control system will need to know the proper color filter for each frame captured by the sensor, so that the full color image can be properly reconstructed in the ISP. An RGBG color pattern is shown, but other colors and/or patterns could be used if advantageous. The relative size of the color segments is shown as equal, but could be adjusted if advantageous. The mechanical structure of the filter is shown as a circle moving rotationally, but could be rectangular with a linear movement, or a different shape with a different movement pattern.
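The timing relationship described in [00176] and [00177] can be sketched as follows. The arc lengths, rotation rate and segment layout below are illustrative assumptions (an R-G-B-G pattern with opaque shutter sections between filters), not measured values from the described hardware.

```python
SEGMENTS = [               # (label, arc length in degrees); one revolution
    ("red", 70), ("shutter", 20),
    ("green", 70), ("shutter", 20),
    ("blue", 70), ("shutter", 20),
    ("green", 70), ("shutter", 20),
]

def segment_timings(rev_per_s):
    """Return (label, duration_ms) for each wheel segment: knowing the arc
    length and rotation rate gives the illumination window (or dark
    readout window) for each frame of the monochromatic sensor."""
    ms_per_degree = 1000.0 / (rev_per_s * 360.0)
    return [(label, arc * ms_per_degree) for label, arc in SEGMENTS]

# A wheel spinning at 30 rev/s gives four color windows per revolution,
# i.e. a 120 Hz color sequence; the opaque sections provide the dark
# intervals in which the sensor readout occurs.
for label, dur in segment_timings(30.0):
    print(f"{label:7s} {dur:5.2f} ms")
```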
[00178] As illustrated in Figure 34, an embodiment for pulsing colored light may comprise a mechanical wheel or cylinder that contains the electronics and heat sinks for Red, Green, Blue or White LEDs. The LEDs would be spaced at a distance related to the rate of rotation or twisting of the cylinder or wheel, to allow for timing of the light pulses consistent with other embodiments in the patent. The wheel or cylinder would be rotated using an electric motor and a mechanical bracket attaching the wheel or cylinder to the electric motor. The motor would be controlled using a microcontroller, FPGA, DSP or other programmable device that would contain a control algorithm for proper timing, as described in the patent. There would be a mechanical aperture on one side that would be optically coupled to a fiber optic to transport the light to the end of the scope by the methods described in the patent. This coupling might also have a mechanical aperture that could open and close to control the amount of light allowed down the fiber optic cable. This would be a mechanical shutter device; alternatively, the electronic shutter designed into a CMOS or CCD type sensor could be used. This device would be difficult to control and calibrate in production, but it is another way to get pulsed light into the system.

[00179] Illustrated in Figure 35 is an embodiment of an emitter 3502 comprising a linear filter 3504 and a shutter mechanism for providing pulsed electromagnetic radiation. The linear filter 3504 and shutter mechanism move horizontally at the frequency required to filter the appropriate wavelengths of light.

[00180] Illustrated in Figure 36 is an embodiment of an emitter 3602 comprising a prism filter 3604 and a shutter mechanism for providing pulsed electromagnetic radiation. The prism filter 3604 filters the light and provides an output, and may include a shutter. The prism filter 3604 moves at the frequency required to provide the correct pattern of color emission.

[00181] Additionally, the teachings and principles of the description may include any and all wavelengths of electromagnetic energy, including the visible and non-visible spectrums, such as infrared (IR), ultraviolet (UV) and X-ray.

[00182] Figure 37 is a schematic diagram illustrating a system 3700 for providing illumination to a light-deficient environment, such as for endoscopic imaging. The system 3700 can be used in combination with any of the systems, methods or devices described in the present invention. The system 3700 includes a light source 3702, a controller 3704, a bridge waveguide 3706, a waveguide connector 3708, a lumen waveguide 3710, a lumen 3712, and an image sensor 3714 with attached optical components (e.g., a lens). The light source 3702 generates light that travels through the bridge waveguide 3706 and the lumen waveguide 3710 to illuminate a scene at a distal end of the lumen 3712. The light source 3702 can be used to emit any wavelength of electromagnetic energy, including visible, infrared, ultraviolet or other wavelengths. The lumen 3712 may be inserted into a patient's body for imaging, such as during a procedure or an examination. Light is provided as illustrated by the dashed lines 3716. A scene illuminated by the light can be captured using the image sensor 3714.

[00183] In one embodiment, the lumen waveguide 3710 includes one or a plurality of optical fibers. The optical fibers may be made of a low-cost material, such as plastic, to allow disposal of the lumen waveguide 3710 and/or other portions of an endoscope. In one embodiment, a single glass fiber having a diameter of 500 microns may be used.
The bridge waveguide 3706 can be permanently attached to the light source 3702. [00184] Figures 38 to 40 are schematic block diagrams illustrating a light source 3800 having a plurality of emitters. Referring to Figure 38, the emitters include a first emitter 3802, a second emitter 3804, and a third emitter 3806. Additional emitters may be included, as discussed further below. The emitters 3802, 3804, and 3806 may include one or more laser emitters that emit light at different wavelengths. For example, the first emitter 3802 can emit a wavelength that is consistent with a blue laser, the second emitter 3804 can emit a wavelength that is consistent with a green laser, and the third emitter 3806 can emit a wavelength that is consistent with a red laser. For example, the first emitter 3802 may include one or more blue lasers, the second emitter 3804 may include one or more green lasers, and the third emitter 3806 may include one or more red lasers. The emitters 3802, 3804, and 3806 emit laser beams toward a collection region 3808, which may be the location of a waveguide, lens, or other optical component that collects and/or delivers light to a waveguide, such as the bridge waveguide 3706 or lumen waveguide 3710 of Figure 37. [00185] In an implementation where a patient has been given a reagent or dye to aid in the identification of certain tissues, structures, chemical reactions, biological processes, etc., the emitters 3802, 3804, and 3806 can emit the wavelength(s) that cause the reagents or dyes to fluoresce. These wavelength(s) can be determined based on the reagents or dyes administered to the patient. In this embodiment, the emitters may need to be highly precise to emit the desired wavelength(s) to fluoresce or activate certain reagents or dyes. [00186] In the embodiment of Figure 38, the emitters 3802, 3804, 3806 each deliver laser light to the collection region 3808 at different angles. The variation in angle can lead to variations in where the electromagnetic energy is located in an output waveguide. For example, if light immediately passes into a bundle of fibers (glass or plastic) at the collection region 3808, the varying angles can cause different amounts of light to enter different fibers. For example, the angle may result in variations in intensity across the collection region 3808. Also, the light from different emitters may not be homogeneously mixed, so some fibers may receive different amounts of light of different colors. The variation in color or light intensity in different fibers can lead to suboptimal lighting of a scene. For example, variations in the intensity or color of the delivered light can result in visible variations in the scene and captured images. [00187] In one embodiment, an intervening optical element may be placed between a bundle of fibers and the emitters 3802, 3804, 3806 to mix the different colors (wavelengths) of light before they enter the fibers or other waveguide. Examples of intervening optical elements include a diffuser, a mixing rod, and one or more lenses or other optical components that mix the light so that a given fiber receives an equal amount of each color (wavelength). For example, each fiber in the fiber bundle can receive the same color mix of light. This mixing can lead to the same color in each fiber but can, in some embodiments, still result in a different overall brightness delivered to different fibers.
In one embodiment, the intervening optical element may also spread out or even out the light over the collection region, so that each fiber carries the same total amount of light (e.g., the light may be spread out in a top hat profile). A diffuser or mixing rod can cause some light to be lost. [00188] Although the collection region 3808 is represented as a physical component in Figure 38, the collection region 3808 may simply be a region where light from the emitters 3802, 3804, and 3806 is delivered. In some cases, the collection region 3808 may include an optical component, such as a diffuser, mixing rod, lens, or any other optical component intervening between the emitters 3802, 3804, 3806 and an output waveguide. [00189] Figure 39 illustrates an embodiment of a light source 3800 with emitters 3802, 3804, 3806 providing light to the collection region 3808 at an equal or substantially equal angle. The light is supplied at an angle substantially perpendicular to the collection region 3808. The light source 3800 includes a plurality of dichroic mirrors, including a first dichroic mirror 3902, a second dichroic mirror 3904, and a third dichroic mirror 3906. The dichroic mirrors 3902, 3904, 3906 include mirrors that reflect a first wavelength of light but transmit (or are transparent to) a second wavelength of light. For example, the third dichroic mirror 3906 can reflect blue laser light provided by the third emitter, while being transparent to the red and green light provided by the first emitter 3802 and the second emitter 3804, respectively. The second dichroic mirror 3904 may be transparent to red light from the first emitter 3802, but reflective to green light from the second emitter 3804. If other colors or wavelengths are included, the dichroic mirrors may be selected to reflect light corresponding to at least one emitter and be transparent to the other emitters. For example, the third dichroic mirror 3906 reflects light from the third emitter 3806 but is transparent to light from the emitters "behind" it, for example the first emitter 3802 and the second emitter 3804. In embodiments where tens or hundreds of emitters are present, each dichroic mirror can be reflective to a corresponding emitter and to emitters in front of it while being transparent to emitters behind it. This could allow tens or hundreds of emitters to emit electromagnetic energy to the collection region 3808 at a substantially equal angle. [00190] Because the dichroic mirrors allow other wavelengths to be transmitted or to pass through them, each of the wavelengths can arrive at the collection region 3808 from the same angle and/or with the same center or focal point. Providing light from the same angle and/or the same focal/central point can significantly improve the reception and mixing of colors at the collection region 3808. For example, a specific fiber can receive the different colors in the same proportions in which they were transmitted/reflected by the emitters 3802, 3804, 3806 and mirrors 3902, 3904, 3906. Light mixing can be significantly improved at the collection region compared to the embodiment of Figure 38. In one embodiment, any of the optical components discussed here can be used at the collection region 3808 to collect light before delivering it to a fiber or fiber bundle. [00191] Figure 40 illustrates an embodiment of a light source 3800 with emitters 3802, 3804, 3806 that also provide light to the collection region 3808 at an equal or substantially equal angle.
However, the light incident on the collection region 3808 is offset from the perpendicular. Angle 4002 indicates the angle of offset from the perpendicular. In one embodiment, the laser emitters 3802, 3804, 3806 may have cross-sectional intensity profiles that are Gaussian. As discussed earlier, improved distribution of light energy between fibers can be achieved by creating a flatter or top-hat-shaped intensity profile. In one embodiment, as angle 4002 is increased, the intensity across the collection region 3808 approaches a flatter, top hat profile. [00192] The top hat profile can also be obtained using one or more lenses, diffusers, mixing rods, or any other optical component intervening between the emitters 3802, 3804, 3806 and an output waveguide, fiber, or fiber optic bundle. [00193] Figure 41 is a schematic diagram illustrating a single optical fiber 4102 emitting through a diffuser 4104 at its output. In one embodiment, the optical fiber 4102 may have a diameter of 500 microns, have a numerical aperture of 0.65, and emit a light cone 4106 of about 70 or 80 degrees without the diffuser 4104. With the diffuser 4104, the light cone 4106 can have an angle of about 110 or 120 degrees. The light cone 4106 can be the main region from which all the light emerges and within which it is evenly distributed. The diffuser 4104 can allow a more uniform distribution of electromagnetic energy over a scene observed by an image sensor. [00194] In one embodiment, the lumen waveguide may include a single plastic or glass optical fiber of about 500 microns. A plastic fiber can be inexpensive, and its width can allow the fiber to carry a sufficient amount of light to a scene despite coupling, diffuser, or other losses. For example, smaller fibers may not be able to carry as much light or energy as a larger fiber. The lumen waveguide 3710 may include a single fiber or a plurality of optical fibers. The lumen waveguide 3710 can receive light directly from the light source or through a bridge waveguide (for example, see the bridge waveguide 3706 in Figure 37). A diffuser can be used to spread the light output over a desired field of view of the image sensor 3714 or other optical components. [00195] Although three emitters are shown in Figures 38 to 40, anywhere from one emitter to hundreds or more can be used in some embodiments. The emitters may emit light of different wavelengths or spectra, which can be used to contiguously cover a desired portion of the electromagnetic spectrum (e.g., the visible spectrum, as well as the infrared and ultraviolet spectra). [00196] In one embodiment, a light source with a plurality of emitters may be used for multispectral or hyperspectral imaging in a light-deficient environment. For example, different chemicals, materials, or tissues may have different responses to different colors or wavelengths of electromagnetic energy. Some tissues have their own spectral signature (how they respond or vary when reflecting wavelengths of electromagnetic radiation). In one embodiment, a specific type of tissue can be detected based on how it responds to a specific wavelength or a specific combination of wavelengths. For example, blood vessel tissue can absorb and reflect different wavelengths or spectra of electromagnetic energy in a unique way that distinguishes it from muscle, fat, bone, nerve, ureter, or other tissues or materials in the body. In addition, specific types of muscle or other tissue types can be distinguished based on their spectral response.
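As a rough, non-authoritative sketch of this kind of spectral classification (the patent does not prescribe an algorithm; the tissue names and signature values below are hypothetical placeholders), each pixel's response across a series of narrowband frames can be compared against reference signatures:

```python
import numpy as np

# Hypothetical reference signatures: relative reflectance of each tissue
# type at the wavelengths pulsed for each frame (illustrative values only).
SIGNATURES = {
    "blood_vessel": np.array([0.10, 0.15, 0.60, 0.70]),
    "muscle":       np.array([0.30, 0.35, 0.40, 0.45]),
    "fat":          np.array([0.55, 0.60, 0.65, 0.70]),
}

def classify_pixels(frames):
    """Label each pixel with the tissue whose signature best matches it.

    frames: array of shape (n_wavelengths, height, width) holding the
    monochrome sensor readings for each narrowband pulse.
    """
    n, h, w = frames.shape
    spectra = frames.reshape(n, -1).T  # one spectrum per pixel
    # Normalize so the match depends on spectral shape, not brightness.
    spectra = spectra / (np.linalg.norm(spectra, axis=1, keepdims=True) + 1e-9)
    names = list(SIGNATURES)
    refs = np.stack([SIGNATURES[k] for k in names])
    refs = refs / np.linalg.norm(refs, axis=1, keepdims=True)
    scores = spectra @ refs.T          # cosine similarity per tissue type
    return scores.argmax(axis=1).reshape(h, w), names
```

A real system would use measured signatures and likely a more robust classifier, but the structure, one monochrome frame per wavelength and one spectrum per pixel, follows the pulsing scheme described above.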
Tissue disease states can also be determined based on spectral information. See US Patent No. 8,289,503. See also US Patent No. [00197] In one embodiment, fluorescence image data and/or multispectral or hyperspectral image data may be obtained by using one or more filters to filter out all light or electromagnetic energy except at the desired wavelength or spectrum. Figure 42 is a block diagram illustrating a filter 4202 for filtering out unwanted wavelengths before light 4208 (or other electromagnetic radiation) encounters an imaging sensor 4204 or other imaging medium (e.g., film). In one embodiment, white light 4208 passes through the filter 4202, and the filtered light 4210 passes through a lens 4206 to be focused onto the imaging sensor 4204 for capture and readout of the image. The filter can be located anywhere in the system, or it can be an attribute of the lens 4206 or the image sensor 4204. [00198] In a light-deficient environment, the light 4208 may include white light emitted by an emitter in the light-deficient environment. The filter 4202 can be selected for the desired exam. For example, if it is desired to detect or highlight a specific tissue, the filter 4202 can be selected to allow passage of wavelengths corresponding to the spectral response of the specific tissue or the fluorescence emission of a specific reagent. The image sensor 4204, which may be a monochrome image sensor, can generate an image. Pixels in the captured image that exceed a threshold or fall below a threshold can then be characterized as corresponding to the specific tissue. This data can then be used to generate an image that indicates the location of the specific tissue. [00199] In another embodiment, a dye or fluorescence reagent can be used to image specific tissue types, pathways, or the like in a body. For example, a fluorescence dye can be administered to a patient, and an image of the dye can then be captured. In one embodiment, fluorescence of the dye may be initiated using a specific wavelength of electromagnetic energy. For example, the dye may fluoresce only when that electromagnetic energy is present. [00200] However, both filters and fluorescence dyes significantly restrict the examination. For example, if a filter is used, the desired spectral response that can be detected, and thus the material or tissue that can be detected, is limited by the filters available. Also, filters may need to be changed or replaced. With regard to dyes, the dye must be administered prior to imaging, and there may be conflicts between the administration of different dyes for different purposes during the same exam. Thus, exams using filters and dyes can take a long time and may require many different exams to obtain the desired information. [00201] In one embodiment, multispectral or hyperspectral imaging in a light-deficient environment can be accomplished using a monochrome image sensor and emitters that emit a plurality of different wavelengths or spectra of electromagnetic energy. In one embodiment, a light source or other electromagnetic source (such as the light source 3800 in any of Figures 38 to 40) may include a plurality of emitters to cover the desired spectra. [00202] Figure 43 illustrates a portion of the electromagnetic spectrum 4300 divided into twenty different subspectra. The number of subspectra is illustrative only. In at least one embodiment, the spectrum 4300 can be divided into hundreds of subspectra, each with a small waveband.
The spectrum can extend from the infrared spectrum 4302, through the visible spectrum 4304, to the ultraviolet spectrum 4306. Each of the subspectra has a waveband 4308 that covers a portion of the spectrum 4300. Each waveband can be defined by an upper wavelength and a lower wavelength. [00203] In one embodiment, at least one emitter (such as a laser emitter) may be included in a light source (such as the light sources 3702, 3800 in Figures 37 to 40) for each subspectrum to provide complete and contiguous coverage across the spectrum 4300. [00204] The waveband widths and coverage provided by the emitters can be selected to provide any desired combination of spectra. For example, contiguous coverage of a spectrum using very small wavebands (e.g., 10 nm or less) can allow highly selective hyperspectral imaging. Because the wavelengths come from emitters that can be selectively activated, extreme flexibility can be gained in determining the spectral responses of a material during an examination. Thus, much more spectral response information can be obtained in less time and within a single exam that would otherwise have required several exams, delays due to the administration of dyes or stains, or the like. In one embodiment, a system can capture hyperspectral image data and process that data to identify what type of tissue exists at each pixel. [00205] Figure 44 is a schematic diagram illustrating a timing diagram 4400 for emission and readout for generating a multispectral or hyperspectral image, according to one embodiment. The solid line represents readout (peaks 4402) and blanking (valleys) periods for capturing a series of frames 4404 to 4414. The series of frames 4404 to 4414 can include a repeating series of frames that can be used to generate hyperspectral data for a video feed stream. The series of frames includes a first frame 4404, a second frame 4406, a third frame 4408, a fourth frame 4410, a fifth frame 4412, and an nth frame 4414. [00206] In one embodiment, each frame is generated based on at least one pulse of electromagnetic energy. The pulse of electromagnetic energy is reflected and detected by an image sensor and then read out in a subsequent readout (4402). Thus, each blanking and readout period results in an image frame for a specific spectrum of electromagnetic energy. For example, the first frame 4404 can be generated based on a spectrum of a first one or more pulses 4416, the second frame 4406 can be generated based on a spectrum of a second one or more pulses 4418, the third frame 4408 can be generated based on a spectrum of a third one or more pulses 4420, the fourth frame 4410 can be generated based on a spectrum of a fourth one or more pulses 4422, the fifth frame 4412 can be generated based on a spectrum of a fifth one or more pulses 4424, and the nth frame 4414 can be generated based on a spectrum of an nth one or more pulses 4426. [00207] The pulses 4416 to 4426 can include energy from a single emitter or from a combination of two or more emitters. For example, the spectrum included in a single readout period or within the plurality of frames 4404 to 4414 can be selected for a desired examination or for detection of a specific tissue or condition. According to one embodiment, one or more pulses may include light from the visible spectrum to generate a color or black and white image, while one or more additional pulses are used to obtain a spectral response to classify a tissue type.
For example, the pulse 4416 may include red light, the pulse 4418 may include blue light, and the pulse 4420 may include green light, while the remaining pulses 4422 to 4426 may include wavelengths and spectra for detecting a specific tissue type. As an additional example, the pulses for a single readout period may include a spectrum generated from multiple different emitters (e.g., different slices of the electromagnetic spectrum) that can be used to detect a specific type of tissue. For example, if the combination of wavelengths results in a pixel having a value that exceeds or falls below a threshold, that pixel can be classified as corresponding to a specific tissue type. Each frame can be used to further narrow down the type of tissue present at a given pixel (e.g., at each pixel in the image) to provide a very specific classification of the tissue and/or tissue state (diseased/healthy) based on the spectral response. [00208] The plurality of frames 4404 to 4414 is shown with readout periods of varying length and pulses of different length or intensity. The blanking period, pulse length or intensity, and the like can be selected based on the sensitivity of a monochrome sensor to the specific wavelength, the energy-emitting capacity of the emitter(s), and/or the carrying capacity of the waveguide. [00209] A hyperspectral image, or hyperspectral image data, obtained in the manner illustrated in Figure 44 may result in a plurality of frames, each based on a different spectrum or combination of spectra. In some cases, tens or hundreds of different frames can be obtained. In other cases, for example for video feed streams, the number of frames may be limited to provide a viewable frame rate. Since combinations of different spectra can be provided in a single readout period, useful and dynamic spectral information can still be obtained even in a video stream. [00210] In one embodiment, a video or other image may include a black and white or color image overlaid with information derived from the spectral response for each pixel. For example, pixels that correspond to a specific tissue or state can be displayed in bright green or another color to assist a physician or other clinician during an examination. [00211] In one embodiment, dual image sensors can be used to obtain three-dimensional images or video feed streams. A three-dimensional examination can allow a better understanding of the three-dimensional structure of the region being examined, as well as a mapping of the different types of tissue or material within the region. [00212] In one embodiment, multispectral or hyperspectral imaging may be used for visualization through materials or substances. For example, infrared wavelengths can pass through some tissues, such as muscle or fat, while reflecting off blood vessels. In one embodiment, infrared waves can penetrate 5, 8, or 10 mm or more into tissue. Obtaining a series of frames that includes at least one infrared frame can allow an exam to provide information about the location of blood vessels below the surface. This can be extremely useful for surgical procedures where it may be desirable to make incisions that avoid blood vessels. In one embodiment, a color or grayscale image may be overlaid with a green color that indicates the location of blood vessels below the surface. Similarly, a known spectral response of blood can be used to look through the blood and visualize tissues or structures of interest during an exam.
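A minimal sketch of this kind of overlay (an assumption, not taken from the patent: a grayscale base frame plus a monochrome infrared frame whose bright pixels are treated as sub-surface vessels, with an arbitrary threshold) could look like the following:

```python
import numpy as np

def overlay_vessels(gray_frame, ir_frame, threshold=0.8):
    """Return an RGB image with suspected vessel pixels painted green.

    gray_frame, ir_frame: 2-D float arrays in [0, 1] read from the
    monochrome sensor (one visible-light frame, one infrared frame).
    threshold: hypothetical cutoff for calling a pixel a vessel.
    """
    rgb = np.stack([gray_frame] * 3, axis=-1)  # grayscale -> RGB base image
    mask = ir_frame > threshold                # assumed vessel response
    rgb[mask] = [0.0, 1.0, 0.0]                # bright green overlay
    return rgb
```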
[00213] The combination of subframes into a single frame for display on a monitor or other display device may occur after the series of frames 4404 to 4414 is captured. A color or grayscale image may be generated from one or more of the frames, and overlay information for the pixels can be determined based on all of the frames or the remaining frames. The color or grayscale image can be combined with the overlay information to generate a single frame. The single frame can be displayed as a single image or as an image in a video stream. [00214] In one embodiment, the hyperspectral data obtained as illustrated in Figure 44 can be provided for analysis by a third-party algorithm to classify a tissue or material captured in the image. In one embodiment, the third-party algorithm can be used to select the spectra or wavebands to be used during imaging so that an analysis of the desired spectral response can be performed. In one embodiment, the analysis of the spectral response may be performed in real time during a medical imaging procedure or other medical procedure. The spectral data can be overlaid on an RGB or black and white image so that a user can readily differentiate certain types of tissues, organs, chemical processes, diseases, etc. In one embodiment, the spectral data may be provided to a computer-operated system, such as a robotics system, for automation of medical imaging or medical procedures. [00215] Figure 45 is a schematic diagram of an imaging system 4500 having a single cut filter. The system 4500 includes an endoscope 4506 or other suitable imaging device having a light source 4508 for use in a light-deficient environment. The endoscope 4506 includes an image sensor 4504 and a filter 4502 for filtering out unwanted wavelengths of light or other electromagnetic radiation before they reach the image sensor 4504. The light source 4508 transmits light that can illuminate a surface 4512 in a light-deficient environment, for example, a body cavity. Light 4510 is reflected from the surface 4512 and passes through the filter 4502 before reaching the image sensor 4504. [00216] The filter 4502 can be used in an implementation where a fluorescent dye or reagent has been administered. In this embodiment, the filter 4502 is configured to filter out all light other than one or more desired wavelengths or spectral bands of light or other electromagnetic radiation. In one embodiment, the filter 4502 is configured to filter out an excitation wavelength of electromagnetic radiation that causes a reagent or dye to fluoresce, so that only the expected relaxation wavelength of the fluoresced reagent or dye is allowed to pass through the filter 4502 and reach the image sensor 4504. [00217] The filter 4502 can additionally be used in an implementation where a fluorescent dye or reagent has not been administered. The filter 4502 may be selected to allow wavelengths corresponding to a desired spectral response to pass through and be read by the image sensor 4504. The image sensor 4504 may be a monochrome image sensor, so that pixels in the captured image that exceed a threshold or fall below a threshold can be characterized as corresponding to a particular spectral response or fluorescence emission. The spectral response or fluorescence emission, as determined by the pixels captured by the image sensor 4504, can indicate the presence of a particular tissue or body structure, a particular condition, a particular chemical process, and so on.
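As a toy numerical model of such a cut filter (the 795 nm to 815 nm excitation band is taken from the examples later in this disclosure, while the relaxation emission near 840 nm is a hypothetical placeholder), the filter can be represented as a transmission window over wavelength:

```python
import numpy as np

def cut_filter(wavelengths_nm, spectrum, block_lo=795.0, block_hi=815.0):
    """Zero out the excitation band so only the relaxation emission passes.

    wavelengths_nm: 1-D array of sampled wavelengths in nanometers.
    spectrum: light intensity at each wavelength before the filter.
    """
    blocked = (wavelengths_nm >= block_lo) & (wavelengths_nm <= block_hi)
    return np.where(blocked, 0.0, spectrum)

# Example: the 795-815 nm excitation light is removed, while an assumed
# fluorescence relaxation emission near 840 nm passes to the image sensor.
wl = np.linspace(700.0, 900.0, 201)
excitation = np.exp(-((wl - 805.0) / 5.0) ** 2)
relaxation = 0.2 * np.exp(-((wl - 840.0) / 8.0) ** 2)
light_at_sensor = cut_filter(wl, excitation + relaxation)
```

A physical filter would have a measured transmission curve rather than a hard window; the sketch only shows the bookkeeping of blocking the excitation band.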
[00218] In one embodiment, the light source 4508 transmits white light that contacts the surface 4512 and is reflected back, where it is filtered by the filter 4502 before reaching the image sensor 4504. In one embodiment, the light source 4508 transmits white light that passes through the filter 4502, so that filtered light of only one or more desired wavelengths emerges from the filter 4502 to be reflected from the surface 4512 and read by the image sensor 4504. For example, in one embodiment, the filter 4502 allows only light having a wavelength of 795 nm to pass through the filter 4502 and contact the image sensor 4504. Further, in one embodiment, the filter 4502 allows only certain wavelengths of light to be reflected back to the image sensor 4504 of the endoscope 4506 or other imaging device. The filter 4502 may be located anywhere in the system 4500 or may be an attribute of a lens or of the image sensor 4504. The filter 4502 may be located in front of and/or behind the image sensor 4504. In one embodiment, the light emitted by the light source 4508 is filtered before reaching the surface 4512, and the reflected light is filtered by an additional filter before being read by the image sensor 4504. [00219] The light source 4508 can be an emitter that may be configured to emit white light or electromagnetic radiation of one or more specific wavelengths. The light source 4508 may include a plurality of lasers configured to emit or pulse light of specified wavelengths. In one embodiment, the light source 4508 emits white light, and the filter 4502 is selected to filter out all unwanted light other than one or more desired wavelengths of light or other electromagnetic radiation. The filter 4502 can be selected for a specific examination or purpose, for example, to highlight a tissue type or body structure, or to highlight a particular chemical process or condition. [00220] Figure 46 is a schematic diagram of an imaging system 4600 having multiple cut filters. The system 4600 includes an endoscope 4606 or other suitable imaging device having a light source 4608 for use in a light-deficient environment. The endoscope 4606 includes an image sensor 4604 and two filters 4602a, 4602b. It should be recognized that, in alternative embodiments, the system 4600 may include any number of filters, and the number and type of filters may be selected for a particular purpose, for example, to collect imaging information about a particular body tissue, body condition, chemical process, etc. The filters 4602a, 4602b are configured to filter out unwanted wavelengths of light or other electromagnetic radiation. The filters 4602a, 4602b may be configured to filter out unwanted wavelengths from white light or other electromagnetic radiation that may be emitted by the light source 4608. The filtered light may reach the surface 4612 (e.g., body tissue) and be reflected back to the image sensor 4604. [00221] Further to the discussion of Figure 45, the filters 4602a, 4602b can be used in an implementation where a fluorescent dye or reagent has been administered. The filters 4602a, 4602b can be configured to block the excitation wavelength emitted for the reagent or dye and allow the image sensor 4604 to read only the relaxation wavelength of the reagent or dye. Additionally, the filters 4602a, 4602b can be used in an implementation where a fluorescent dye or reagent has not been administered.
In this implementation, the filters 4602a, 4602b can be selected to allow wavelengths corresponding to a desired spectral response to pass through and be read by the image sensor 4604. [00222] The multiple filters 4602a, 4602b can each be configured to filter out a different range of wavelengths of the electromagnetic spectrum. For example, one filter can be configured to filter out wavelengths longer than a desired wavelength range, and the other filter can be configured to filter out wavelengths shorter than the desired wavelength range. The combination of the two or more filters may result in the image sensor reading only a particular wavelength or band of wavelengths. [00223] In one embodiment, the filters 4602a, 4602b are customized so that electromagnetic radiation between 513 nm and 545 nm contacts the image sensor 4604. In one embodiment, the filters 4602a, 4602b are customized so that electromagnetic radiation between 565 nm and 585 nm contacts the image sensor 4604. In one embodiment, the filters 4602a, 4602b are customized so that electromagnetic radiation between 900 nm and 1000 nm contacts the image sensor 4604. [00224] In one embodiment, the system 4600 includes multiple image sensors 4604 and may, in particular, include two image sensors for use in generating a three-dimensional image. The image sensor(s) 4604 may be color/wavelength agnostic and configured to read any wavelength of electromagnetic radiation that is reflected from the surface 4612. In one embodiment, the image sensors 4604 are each color-dependent or wavelength-dependent and configured to read electromagnetic radiation of a particular wavelength that is reflected from the surface 4612 back to the image sensors 4604. Alternatively, the image sensor 4604 may include a single image sensor with a plurality of different pixel sensors configured to read different wavelengths or colors of light, such as a Bayer color filter array. Alternatively, the image sensor 4604 may include one or more color-agnostic image sensors that can be configured to read different wavelengths of electromagnetic radiation according to a pulsing schedule such as those illustrated in Figures 5 through 7E and 15 through 16, for example. [00225] Figure 47 is a schematic diagram illustrating a system 4700 for mapping a surface and/or tracking an object in a light-deficient environment. In one embodiment, an endoscope 4702 in a light-deficient environment pulses a grid array 4706 (which may be termed a laser map pattern) onto a surface 4704. The grid array 4706 may include vertical hashing 4708 and horizontal hashing 4710 in an embodiment as illustrated in Figure 47. [00226] In one embodiment, the system 4700 pulses a grid array 4706 that can be used to determine a three-dimensional surface and/or track a location of an object, e.g., a tool or other device, in a light-deficient environment. In one embodiment, the system 4700 may provide data to a third-party computer system or algorithm to determine surface dimensions and configurations through light detection and ranging (LIDAR) mapping. The system 4700 can pulse any suitable wavelength of light or electromagnetic radiation in the grid array 4706, including, for example, ultraviolet light, visible light, and/or infrared or near-infrared light. The surface 4704 and/or objects within the environment can be mapped and tracked at very high resolution and with very high accuracy and precision.
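The patent does not spell out a concrete ranging computation; as one hedged sketch of the LIDAR-style mapping mentioned above, a round-trip time measured for each pulsed grid point could be converted to a distance, building a coarse depth map of the surface (the timing values below are assumed):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def depth_from_round_trip(times_s):
    """Convert per-grid-point round-trip times (seconds) to distances (meters).

    times_s: rows of measured round-trip times, one entry per point of the
    pulsed grid array; the values used here are illustrative assumptions.
    """
    return [[SPEED_OF_LIGHT_M_S * t / 2.0 for t in row] for row in times_s]

# Example: a 2x2 patch of the grid; a ~0.3 ns round trip is ~4.5 cm away.
depth_map = depth_from_round_trip([[3.0e-10, 3.1e-10],
                                   [3.0e-10, 2.9e-10]])
```

In practice, resolving sub-millimeter depth differences at these distances would require picosecond-class timing or, more likely, a structured-light triangulation scheme; the sketch only shows the time-to-distance bookkeeping.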
[00227] In one embodiment, the system 4700 includes an imaging device having a tube, one or more image sensors, and a lens assembly having an optical element corresponding to the one or more image sensors. The system 4700 may include a light engine having an illumination source that generates one or more pulses of electromagnetic radiation and a lumen that transmits the one or more pulses of electromagnetic radiation to a distal tip of an endoscope within a light-deficient environment, for example, a body cavity. In one embodiment, at least a portion of the one or more pulses of electromagnetic radiation includes a laser map pattern that is emitted onto a surface within the light-deficient environment, such as a surface of body tissue and/or a surface of tools or other devices within the body cavity. The endoscope 4702 can include a two-dimensional, three-dimensional, or n-dimensional camera for mapping and/or tracking the surface, dimensions, and configurations within the light-deficient environment. [00228] In one embodiment, the system 4700 includes a processor for determining a distance of an endoscope or tool from an object, e.g., the surface 4704. The processor may additionally determine an angle between the endoscope or tool and the object. The processor may additionally determine surface area information about the object, including, for example, the size of surgical tools, the size of structures, the size of anatomical structures, location information, and other position data and metrics. The system 4700 may include one or more image sensors that provide image data that is fed to a control system to determine the distance of an endoscope or tool from an object, for example, the surface 4704. The image sensors can send information to a control system to determine an angle between the endoscope or tool and the object. Additionally, the image sensors can send information to a control system to determine surface area information about the object, the size of surgical tools, the size of structures, the size of anatomical structures, location information, and other position data and metrics. [00229] In one embodiment, the grid array 4706 is pulsed by an illumination source of the endoscope 4702 at a rate sufficient that the grid array 4706 is not visible to a user. In many implementations, it can be distracting for a user to see the grid array 4706 during an endoscopic imaging procedure and/or endoscopic surgical procedure. The grid array 4706 can be pulsed for periods brief enough that the grid array 4706 cannot be detected by the human eye. In an alternative embodiment, the endoscope 4702 pulses the grid array 4706 at a recurring frequency sufficient for the grid array 4706 to be viewed by a user. In this embodiment, the grid array 4706 may be superimposed on an image of the surface 4704 on a monitor. The grid array 4706 can be superimposed on a black and white or RGB image of the surface 4704 so that the grid array 4706 is visible to a user during use of the system 4700. A user of the system 4700 can indicate whether the grid array 4706 should be overlaid on an image of the surface 4704 and/or whether the grid array 4706 should be visible to the user. The system 4700 can include a monitor that provides real-time measurements of the distance from the endoscope 4702 to the surface 4704 or other object within the light-deficient environment.
The monitor can additionally provide real-time surface area information about the surface 4704 and/or any objects, structures, or tools within the light-deficient environment. The measurements can be accurate to less than one millimeter. [00230] The endoscope 4702 can pulse electromagnetic radiation according to a pulsing schedule such as those illustrated in Figures 5 through 7E and 15 through 16, for example, which may additionally include pulsing of the grid array 4706 along with pulsing of Red, Green, and Blue light to generate an RGB image and additionally generate the grid array 4706, which can be overlaid on the RGB image and/or used to map and track the surface 4704 and objects within the light-deficient environment. [00231] In one embodiment, the endoscope 4702 includes one or more color-agnostic image sensors. In one embodiment, the endoscope 4702 includes two color-agnostic image sensors for generating a three-dimensional image or map of the light-deficient environment. The image sensors can generate an RGB image of the light-deficient environment according to a pulsing schedule as described in the present invention. Additionally, the image sensors can determine data for mapping the light-deficient environment and tracking one or more objects within the light-deficient environment based on the data determined when the grid array 4706 is pulsed. Additionally, the image sensors can determine spectral or hyperspectral data along with fluorescence imaging data according to a pulsing schedule that can be modified by a user to suit the needs of a particular imaging procedure. In one embodiment, a pulsing schedule includes Red, Green, and Blue pulses together with pulsing of the grid array 4706 and/or pulsing for generating hyperspectral image data and/or fluorescence image data. In various implementations, the pulsing schedule can include any suitable combination of pulses of electromagnetic radiation according to a user's needs. The recurring frequency of the different wavelengths of electromagnetic radiation can be determined based on, for example, the energy of a given pulse, the needs of the user, whether certain data (e.g., hyperspectral data and/or fluorescence imaging data) need to be continually updated or can be updated less frequently, and so on. [00232] The pulsing schedule can be modified in any suitable way, and certain pulses of electromagnetic radiation can be repeated at any suitable frequency according to the needs of a user or of a computer-implemented program for a given imaging procedure. For example, in an embodiment where surface tracking data generated based on the grid array 4706 is fed to a computer-implemented program for use in, for example, a robotic surgical procedure, the grid array 4706 may be pulsed more often than if the surface tracking data were provided to a user who is viewing the scene during the imaging procedure. In such an embodiment where the surface tracking data is used for a robotic surgical procedure, the surface tracking data may need to be updated more frequently or may need to be extremely accurate so that the computer-implemented program can perform the robotic surgical procedure with precision and accuracy. [00233] In one embodiment, the system 4700 is configured to generate an occupancy grid map comprising an array of grid cells. The system 4700 is configured to store a height value for each of the grid cells to determine a surface mapping of a three-dimensional environment within the light-deficient environment.
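A minimal sketch of such an occupancy grid (an assumed structure; the class and method names are hypothetical) stores one height value per cell, indexed by position:

```python
class OccupancyGridMap:
    """Array of grid cells, each storing a surface height for 3-D mapping."""

    def __init__(self, rows, cols, cell_size_mm):
        self.cell_size_mm = cell_size_mm
        self.heights = [[None] * cols for _ in range(rows)]  # None = unmapped

    def update(self, x_mm, y_mm, height_mm):
        """Record the measured surface height for the cell containing (x, y)."""
        row = int(y_mm // self.cell_size_mm)
        col = int(x_mm // self.cell_size_mm)
        self.heights[row][col] = height_mm

    def height_at(self, x_mm, y_mm):
        row = int(y_mm // self.cell_size_mm)
        col = int(x_mm // self.cell_size_mm)
        return self.heights[row][col]

# Example: 1 mm cells covering a 50 mm x 50 mm field of view.
grid = OccupancyGridMap(rows=50, cols=50, cell_size_mm=1.0)
grid.update(x_mm=12.4, y_mm=3.7, height_mm=2.1)
```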
[00234] Figure 48 is a schematic flow diagram of a method 4800 for hyperspectral imaging in a light-deficient environment. The method 4800 can be performed by an imaging system, such as the endoscopic imaging system illustrated in Figure 37. [00235] The method 4800 includes emitting, at 4802, a plurality of narrowband pulses during readout periods of a monochrome image sensor. The pulses can be emitted at 4802 using a light source that includes a plurality of emitters that emit electromagnetic energy within narrow frequency bands. For example, the light source may include at least one emitter for each of a plurality of frequency bands covering a desired spectrum. The monochrome image sensor reads out, at 4804, pixel data from the monochrome image sensor following the readout periods to generate a plurality of frames. Each frame can include different spectral content. The frames may include a repeating plurality of frames that can be used to generate a digital video stream. Each frame can be based on the energy emitted by one or more emitters of the light source. In one embodiment, a frame may be based on a combination of light emitted by the light sources to generate a combination of frequencies matched to a frequency response of a desired tissue or substance. A controller, CCU, or other system determines, at 4806, a spectral response of a tissue for one or more pixels based on the plurality of frames. For example, the pixel values and knowledge of the frequencies of light emitted for each frame can be used to determine a frequency response for a specific pixel, based on the pixel values in the plurality of frames. The system may generate, at 4808, a combined image based on the plurality of frames, the combined image comprising an overlay indicating the spectral response for the one or more pixels. For example, the combined image can be a grayscale or color image where pixels corresponding to a specific tissue or classification are shown in bright green. [00236] Figure 49 is a schematic flow diagram of a method 4900 for fluorescence imaging in a light-deficient environment. The method 4900 can be performed by an imaging system, such as the endoscopic imaging system illustrated in Figure 37. [00237] The method 4900 includes emitting, at 4902, a plurality of narrowband pulses during readout periods of a monochrome image sensor. The pulses can be emitted at 4902 using a light source that includes a plurality of emitters that emit electromagnetic energy within narrow frequency bands. For example, the light source may include at least one emitter for each of a plurality of frequency bands covering a desired spectrum. The monochrome image sensor reads out, at 4904, pixel data from the monochrome image sensor following the readout periods to generate a plurality of frames. Each frame can include different spectral content. The frames may include a repeating plurality of frames that can be used to generate a digital video stream. Each frame can be based on the energy emitted by one or more emitters of the light source. In one embodiment, a frame may be based on a combination of light emitted by the light sources to generate a combination of frequencies matched to a frequency response of a desired tissue or substance. A controller, CCU, or other system determines, at 4906, a fluorescence relaxation emission of a reagent for one or more pixels based on the plurality of frames.
For example, the pixel values and knowledge of the frequencies of light emitted for each frame can be used to determine a frequency response for a specific pixel, based on the pixel values in the plurality of frames. The system may generate, at 4908, a combined image based on the plurality of frames, the combined image comprising an overlay indicating the fluorescence relaxation emission for the one or more pixels. For example, the combined image can be a grayscale or color image where pixels corresponding to a specific tissue or classification are shown in bright green. [00238] The following examples refer to additional embodiments: [00239] Example 1 is an endoscopic system for use in a light-deficient environment. The system includes an imaging device. The imaging device includes a tube, one or more image sensors, and a lens assembly comprising at least one optical element corresponding to the image sensor. The system includes a monitor for a user to view a scene and an image signal processing controller. The system includes an illumination mechanism. The illumination mechanism includes an illumination source that generates one or more pulses of electromagnetic radiation. The illumination mechanism further includes a lumen that transmits the one or more pulses of electromagnetic radiation to a distal tip of an endoscope, at least a portion of the one or more pulses of electromagnetic radiation including an excitation wavelength of electromagnetic radiation between 795 nm and 815 nm that causes a reagent to fluoresce at a wavelength that is different from the excitation wavelength of the portion of the one or more pulses of electromagnetic radiation. [00240] Example 2 is a system as in Example 1, the system additionally comprising a filter that blocks the excitation wavelength of electromagnetic radiation between 795 nm and 815 nm. [00241] Example 3 is a system as in any of Examples 1 to 2, the filter being located on the at least one optical element of the lens assembly, so that the filter blocks the excitation wavelength and allows a wavelength of the fluorescing reagent to pass through the filter. [00242] Example 4 is a system as in any of Examples 1 to 3, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor. [00243] Example 5 is a system as in any of Examples 1 to 4, with the single image assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00244] Example 6 is a system as in any of Examples 1 to 5, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor. [00245] Example 7 is a system as in any of Examples 1 to 6, wherein the overlay image is assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00246] Example 8 is a system as in any of Examples 1 to 7, wherein the image sensor detects a wavelength of the fluorescing reagent to provide an image of one or more critically important structures in a human body. [00247] Example 9 is a system as in any of Examples 1 to 8, with the critically important structures in a human body including one of a nerve, a ureter, a blood vessel, an artery, a blood stream, and a tumor.
[00248] Example 10 is a system as in any of Examples 1 to 9, wherein the one or more critically important structures are cancer cells, and wherein the system receives fluorescent electromagnetic radiation from a molecule that binds a fluorophore, which fluoresces when exposed to electromagnetic radiation having a wavelength between 795 nm and 815 nm, to one or more of the cancer cells. [00249] Example 11 is a system as in any of Examples 1 to 10, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor. [00250] Example 12 is a system as in any of Examples 1 to 11, with the single image assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00251] Example 13 is a system as in any of Examples 1 to 12, where each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor. [00252] Example 14 is a system as in any of Examples 1 to 13, wherein the overlay image is assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00253] Example 15 is a system as in any of Examples 1 to 14, wherein the illumination source generates one or more pulses of electromagnetic radiation at a wavelength of 370 nm to 420 nm. [00254] Example 16 is a system as in any of Examples 1 to 15, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor. [00255] Example 17 is a system as in any of Examples 1 to 16, with the single image assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00256] Example 18 is a system as in any of Examples 1 to 17, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor. [00257] Example 19 is a system as in any of Examples 1 to 18, wherein the overlay image is assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00258] Example 20 is a system as in any of Examples 1 to 19, wherein the illumination source generates one or more pulses of electromagnetic radiation at a wavelength of 600 nm to 670 nm. [00259] Example 21 is a system as in any of Examples 1 to 20, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor. [00260] Example 22 is a system as in any of Examples 1 to 21, wherein the single image is assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00261] Example 23 is a system as in any of Examples 1 to 22, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor.
[00262] Example 24 is a system as in any of Examples 1 to 23, wherein the overlay image is assigned a visible color for use on the monitor; where the visible color is 8 bits or 16 bits or n bits. [00263] Example 25 is a system as in any of Examples 1 to 24, the illumination mechanism comprising a polarization filter. [00264] Example 26 is a system as in any of Examples 1 to 25, with the polarization filter located in a path of the electromagnetic radiation. [00265] Example 27 is a system as in any of Examples 1 to 26, with the polarization filter located at a proximal end of the lumen. [00266] Example 28 is a system as in any of Examples 1 to 27, with the polarization filter located at a distal end of the lumen. [00267] Example 29 is a system as in any of Examples 1 to 28, the lens assembly comprising an electromagnetic radiation filter. [00268] Example 30 is a system as in any of Examples 1 to 29, wherein the lens assembly comprises a polarization filter. [00269] Example 31 is a system as in any of Examples 1 to 30, with each pulse of electromagnetic radiation resulting in an exposure frame created by the image sensor; one or more exposure frames being fed to a corresponding system that will provide the location of critical tissue structures. [00270] Example 32 is a system as in any of Examples 1 to 31, with the location of the critically important structures being received by the endoscopic system and superimposed on a monitor, the critically important structures being coded in any color selected by either an algorithm or a user. [00271] It will be recognized that various features described in the present invention provide significant advantages and advances in the art. The following claims are examples of some of these features. [00272] In the foregoing Detailed Description, various features of the disclosure are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure should not be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, inventive aspects lie in less than all features of a single embodiment described above. [00273] It should be understood that the arrangements described above are only illustrative of the application of the principles of the disclosure. Various modifications and alternative arrangements may be devised by those skilled in the art without departing from the spirit and scope of the disclosure, and the appended claims are intended to cover such modifications and arrangements. [00274] Thus, although the disclosure has been shown in the drawings and described above with particularity and detail, it will be apparent to those skilled in the art that various modifications, including, but not limited to, variations in size, materials, shape, form, function and manner of operation, assembly, and use, can be made without departing from the principles and concepts set forth in the present invention. [00275] Additionally, where appropriate, the functions described in the present invention can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs) can be programmed to carry out one or more of the systems and procedures described in the present invention.
Certain terms are used throughout the description and the claims that follow to refer to particular system components. As those skilled in the art will appreciate, components can be identified by different names. This document does not intend to distinguish between components that differ in name but not in function. [00276] The foregoing description has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form described. Many modifications and variations are possible in light of the above teaching. Additionally, it should be noted that any and all of the alternative implementations mentioned above may be used in any desired combination to form additional hybrid implementations of the disclosure. [00277] Additionally, while specific implementations of the disclosure have been described and illustrated, the disclosure is not to be limited to the specific forms or arrangements of parts so described and illustrated. The scope of the disclosure is to be defined by the appended claims, any future claims presented here and in different patent applications, and their equivalents.
Claims:
Claims (32) [1] 1. Endoscopic system for use in a light-deficient environment comprising: an imaging device comprising: a tube; one or more image sensors; and a lens assembly comprising at least one optical element corresponding to said image sensor; a monitor for a user to view a scene; an image signal processing controller; and an illumination mechanism, wherein the illumination mechanism comprises: an illumination source that generates one or more pulses of electromagnetic radiation; and a lumen that transmits the one or more pulses of electromagnetic radiation to a distal tip of an endoscope, wherein at least a portion of the one or more pulses of electromagnetic radiation includes an excitation wavelength of electromagnetic radiation between 795 nm and 815 nm that causes a reagent to fluoresce at a wavelength that is different from the excitation wavelength of the portion of the one or more pulses of electromagnetic radiation. [2] 2. An endoscopic system as claimed in claim 1, wherein the system further comprises a filter that blocks the excitation wavelength of electromagnetic radiation between 795 nm and 815 nm. [3] 3. Endoscopic system, according to claim 2, characterized in that the filter is located on the at least one optical element of the lens assembly, so that the filter blocks the excitation wavelength and allows a wavelength of the fluorescing reagent to pass through the filter. [4] 4. Endoscopic system, according to claim 2, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor. [5] 5. An endoscopic system according to claim 4, characterized in that the single image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit. [6] 6. Endoscopic system, according to claim 2, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor. [7] 7. An endoscopic system according to claim 6, characterized in that the overlay image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit. [8] 8. An endoscopic system according to claim 1, characterized in that the image sensor detects a wavelength of the fluorescing reagent to provide an image of one or more critically important structures in a human body. [9] 9. An endoscopic system according to claim 8, characterized in that the critically important structures in a human body include one of a nerve, a ureter, a blood vessel, an artery, a blood stream and a tumor. [10] 10. Endoscopic system, according to claim 8, characterized in that the one or more structures of critical importance are cancer cells, and in which the system receives fluorescent electromagnetic radiation from a molecule that binds a fluorophore, which fluoresces when exposed to electromagnetic radiation having a wavelength between 795 nm and 815 nm, to one or more of the cancer cells. [11] 11. Endoscopic system, according to claim 1, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor.
[12] 12. An endoscopic system as claimed in claim 11, characterized in that the single image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit. [13] 13. Endoscopic system, according to claim 1, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor. [14] 14. An endoscopic system as claimed in claim 13, characterized in that the overlay image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit. [15] 15. Endoscopic system, according to claim 1, characterized in that the illumination source generates one or more pulses of electromagnetic radiation at a wavelength from 370 nm to 420 nm. [16] 16. Endoscopic system, according to claim 15, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor. [17] 17. The endoscopic system of claim 16, characterized in that the single image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit. [18] 18. Endoscopic system, according to claim 15, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor. [19] 19. An endoscopic system according to claim 18, characterized in that the overlay image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit. [20] 20. Endoscopic system, according to claim 1, characterized in that the illumination source generates one or more pulses of electromagnetic radiation at a wavelength from 600 nm to 670 nm. [21] 21. Endoscopic system, according to claim 20, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as a single image on the monitor. [22] 22. The endoscopic system of claim 21, characterized in that the single image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit. [23] 23. Endoscopic system, according to claim 22, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; where one or more exposure frames are displayed to a user as an overlay image on the monitor. [24] 24. Endoscopic system, according to claim 23, characterized in that the overlay image is assigned a visible color for use on the monitor; where the visible color is 8-bit or 16-bit or n-bit.
[30] Endoscopic system according to claim 1, characterized in that the lens assembly comprises a polarization filter. [31] 31. Endoscopic system, according to claim 1, characterized in that each pulse of electromagnetic radiation results in an exposure frame created by the image sensor; wherein one or more exposure frames are fed to a corresponding system which will provide the location of critical tissue structures. [32] 32. Endoscopic system, according to claim 31, characterized in that the location of structures of critical importance is received by the endoscopic system and superimposed on a monitor, where critically important structures are coded to any color selected by an algorithm or a user.
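As a concrete illustration of the pulse-per-frame scheme recited in claims 1, 15, and 20 (one pulse of electromagnetic radiation per exposure frame, with an excitation pulse in the 795-815 nm band interleaved among visible pulses), the following Python sketch shows one plausible scheduling loop. The patent discloses no source code; the `Pulse` type, the `fire_emitter` and `capture_exposure_frame` stubs, and the particular red-green-blue-excitation cycle are hypothetical.

```python
from dataclasses import dataclass
from itertools import cycle

@dataclass
class Pulse:
    label: str
    wavelength_nm: float  # nominal center wavelength of the emitter

# Hypothetical pulse cycle: visible-light pulses interleaved with a
# fluorescence excitation pulse inside the claimed 795-815 nm band.
PULSE_CYCLE = [
    Pulse("red", 630.0),
    Pulse("green", 532.0),
    Pulse("blue", 465.0),
    Pulse("excitation", 805.0),
]

def fire_emitter(wavelength_nm: float) -> None:
    """Stand-in for the emitter driver; a real system would strobe a laser or LED here."""
    pass

def capture_exposure_frame() -> list:
    """Stand-in for the sensor readout; one pulse yields one exposure frame."""
    return [[0.0]]  # dummy monochrome frame

def run_pulse_schedule(n_frames: int):
    """Fire one pulse per sensor readout and pair each pulse with its exposure frame."""
    for _, pulse in zip(range(n_frames), cycle(PULSE_CYCLE)):
        fire_emitter(pulse.wavelength_nm)
        yield pulse.label, capture_exposure_frame()

# Four consecutive readouts cover one full cycle: red, green, blue, excitation.
labels = [label for label, _ in run_pulse_schedule(4)]
assert labels == ["red", "green", "blue", "excitation"]
```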
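Claims 4 to 7, 13, 14, and 32 recite displaying exposure frames as a single or overlay image in which the fluorescence data is assigned a visible color. Below is a minimal NumPy sketch of one way such an overlay could be rendered, assuming a monochrome fluorescence exposure frame and a white-light RGB frame; the alpha blend, the 0.2 threshold, and the magenta highlight are illustrative choices, not taken from the patent.

```python
import numpy as np

def overlay_fluorescence(rgb_frame: np.ndarray,
                         fluor_frame: np.ndarray,
                         color=(1.0, 0.0, 1.0),
                         threshold: float = 0.2) -> np.ndarray:
    """Blend a monochrome fluorescence exposure frame over a white-light RGB
    frame, painting fluorescing pixels in an assigned visible color.

    rgb_frame:   H x W x 3 array with values in [0, 1]
    fluor_frame: H x W array with values in [0, 1]
    """
    intensity = np.clip(fluor_frame, 0.0, 1.0)
    # Pixels below the threshold keep the white-light image; brighter
    # fluorescence blends in more of the assigned highlight color.
    alpha = np.where(intensity >= threshold, intensity, 0.0)[..., None]
    highlight = np.asarray(color, dtype=float)
    return (1.0 - alpha) * rgb_frame + alpha * highlight

# Usage: a black 480x640 white-light frame plus a synthetic fluorescence frame.
rgb = np.zeros((480, 640, 3))
fluor = np.zeros((480, 640))
fluor[200:280, 300:380] = 0.9  # pretend a ureter fluoresces here
display = overlay_fluorescence(rgb, fluor)  # magenta patch on the overlay
```

In a real pipeline the image signal processing controller would perform this combination once per video frame, and claim 32 allows the assigned color to be any value, 8-bit, 16-bit, or n-bit, selected by an algorithm or by a user.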
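Claims 2 and 3 place a filter on an optical element of the lens assembly that blocks the 795-815 nm excitation band while passing the fluorescing reagent's longer, Stokes-shifted emission wavelength. The toy transmission model below assumes an ideal notch filter; real filter curves are continuous, and the 835 nm test value is only a plausible emission wavelength for an indocyanine-green-like reagent, not a figure from the patent.

```python
def notch_transmission(wavelength_nm: float,
                       block_band: tuple = (795.0, 815.0)) -> float:
    """Idealized excitation-blocking filter: zero transmission inside the
    claimed 795-815 nm excitation band, full transmission elsewhere."""
    lo, hi = block_band
    return 0.0 if lo <= wavelength_nm <= hi else 1.0

assert notch_transmission(805.0) == 0.0  # excitation light is rejected
assert notch_transmission(835.0) == 1.0  # Stokes-shifted emission reaches the sensor
```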
Priority:
US 62/610,888, filed 2017-12-27
US 62/723,989, filed 2018-08-28
PCT/US2018/067729, filed 2018-12-27 (published as WO2019133739A1)